Reposted from: https://www.cnblogs.com/rhzhang/p/5185686.html
Preface
I consider myself a polite person... but sometimes I just can't help venting a little:
1. I really can't keep up with the times; here I am, still working on Windows application development... ahem;
2. DirectShow is infuriating! All I wanted was raw camera data, yet it insists that I learn about all these .ax files, filters, graphs... and the material is both scarce and obscure;
3. Here's hoping Windows XP and everything before it leave the stage soon, so that DirectShow is no longer a necessity!
Audio/Video Capture
Enumerating and configuring devices
Reading media through the Source Reader
Related documentation:
https://msdn.microsoft.com/en-us/library/dd743690.aspx
https://msdn.microsoft.com/en-us/library/dd317912.aspx
https://msdn.microsoft.com/en-us/library/dd940326.aspx
https://msdn.microsoft.com/en-us/library/dd940328.aspx (handling device loss while a device is in use; still to study)
https://msdn.microsoft.com/en-us/library/ee663602.aspx
https://msdn.microsoft.com/en-us/library/aa473818.aspx (media types)
Other topics
Audio/video capture + encoding (transcode): https://msdn.microsoft.com/en-us/library/ff485863.aspx
Playing media files: https://msdn.microsoft.com/en-us/library/ms703190.aspx
Sample code
In the Samples\multimedia\mediafoundation directory of the Windows SDK
Enumerating Video Capture Devices
This topic describes how to enumerate the video capture devices on the user's system, and how to create an instance of a device.
To enumerate the video capture devices on the system, do the following:
- Call MFCreateAttributes to create an attribute store. This function receives an IMFAttributes pointer.
- Call IMFAttributes::SetGUID to set the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE attribute. Set the attribute value to MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID.
- Call MFEnumDeviceSources. This function receives an array of IMFActivate pointers and the array size. Each pointer represents a distinct video capture device.
To create an instance of a capture device:
- Call IMFActivate::ActivateObject to get a pointer to the IMFMediaSource interface.
The following code shows these steps:
HRESULT CreateVideoDeviceSource(IMFMediaSource **ppSource)
{
    *ppSource = NULL;

    IMFMediaSource *pSource = NULL;
    IMFAttributes *pAttributes = NULL;
    IMFActivate **ppDevices = NULL;
    UINT32 count = 0;   // Initialized here so the cleanup loop is safe on early failure.

    // Create an attribute store to specify the enumeration parameters.
    HRESULT hr = MFCreateAttributes(&pAttributes, 1);
    if (FAILED(hr))
    {
        goto done;
    }

    // Source type: video capture devices
    hr = pAttributes->SetGUID(
        MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
        MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID
        );
    if (FAILED(hr))
    {
        goto done;
    }

    // Enumerate devices.
    hr = MFEnumDeviceSources(pAttributes, &ppDevices, &count);
    if (FAILED(hr))
    {
        goto done;
    }

    if (count == 0)
    {
        hr = E_FAIL;
        goto done;
    }

    // Create the media source object.
    hr = ppDevices[0]->ActivateObject(IID_PPV_ARGS(&pSource));
    if (FAILED(hr))
    {
        goto done;
    }

    *ppSource = pSource;
    (*ppSource)->AddRef();

done:
    SafeRelease(&pAttributes);
    for (DWORD i = 0; i < count; i++)
    {
        SafeRelease(&ppDevices[i]);
    }
    CoTaskMemFree(ppDevices);
    SafeRelease(&pSource);
    return hr;
}
After you create the media source, release the interface pointers and free the memory for the array:
SafeRelease(&pAttributes);
for (DWORD i = 0; i < count; i++)
{
SafeRelease(&ppDevices[i]);
}
CoTaskMemFree(ppDevices);
Audio/Video Capture in Media Foundation
Microsoft Media Foundation supports audio and video capture. Video capture devices are supported through the UVC class driver and must be compatible with UVC 1.1. Audio capture devices are supported through Windows Audio Session API (WASAPI).
A capture device is represented in Media Foundation by a media source object, which exposes the IMFMediaSource interface. In most cases, the application will not use this interface directly, but will use a higher-level API such as the Source Reader to control the capture device.
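As a sketch of that higher-level path: once a capture media source exists, it can be handed to the Source Reader with MFCreateSourceReaderFromMediaSource. (A minimal sketch with error handling trimmed; pSource stands in for a media source created as described in this topic.)

```cpp
// Sketch: wrap an existing capture media source in a Source Reader.
// pSource is assumed to be an IMFMediaSource* created as shown in this topic.
IMFSourceReader *pReader = NULL;

HRESULT hr = MFCreateSourceReaderFromMediaSource(pSource, NULL, &pReader);
if (SUCCEEDED(hr))
{
    // Pull media samples from the device with pReader->ReadSample(...).
}
SafeRelease(&pReader);
```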
Enumerate Capture Devices
To enumerate the capture devices on the system, perform the following steps:
- Call the MFCreateAttributes function to create an attribute store.
- Set the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE attribute to one of the following values:
  - MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_GUID: Enumerate audio capture devices.
  - MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID: Enumerate video capture devices.
- Call the MFEnumDeviceSources function. This function allocates an array of IMFActivate pointers. Each pointer represents an activation object for one device on the system.
- Call the IMFActivate::ActivateObject method to create an instance of the media source from one of the activation objects.
The following example creates a media source for the first video capture device in the enumeration list:
HRESULT CreateVideoCaptureDevice(IMFMediaSource **ppSource)
{
    *ppSource = NULL;

    UINT32 count = 0;
    IMFAttributes *pConfig = NULL;
    IMFActivate **ppDevices = NULL;

    // Create an attribute store to hold the search criteria.
    HRESULT hr = MFCreateAttributes(&pConfig, 1);

    // Request video capture devices.
    if (SUCCEEDED(hr))
    {
        hr = pConfig->SetGUID(
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID
            );
    }

    // Enumerate the devices.
    if (SUCCEEDED(hr))
    {
        hr = MFEnumDeviceSources(pConfig, &ppDevices, &count);
    }

    // Create a media source for the first device in the list.
    if (SUCCEEDED(hr))
    {
        if (count > 0)
        {
            hr = ppDevices[0]->ActivateObject(IID_PPV_ARGS(ppSource));
        }
        else
        {
            hr = MF_E_NOT_FOUND;
        }
    }

    for (DWORD i = 0; i < count; i++)
    {
        ppDevices[i]->Release();
    }
    CoTaskMemFree(ppDevices);
    SafeRelease(&pConfig);   // Release the attribute store (leaked in the original listing).
    return hr;
}
You can query the activation objects for various attributes, including the following:
- The MF_DEVSOURCE_ATTRIBUTE_FRIENDLY_NAME attribute contains the display name of the device. The display name is suitable for showing to the user, but might not be unique.
- For video devices, the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK attribute contains the symbolic link to the device. The symbolic link uniquely identifies the device on the system, but is not a readable string.
- For audio devices, the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_ENDPOINT_ID attribute contains the audio endpoint ID of the device. The audio endpoint ID is similar to a symbolic link. It uniquely identifies the device on the system, but is not a readable string.
The following example takes an array of IMFActivate pointers and prints the display name of each device to the debug window:
void DebugShowDeviceNames(IMFActivate **ppDevices, UINT count)
{
    for (DWORD i = 0; i < count; i++)
    {
        HRESULT hr = S_OK;
        WCHAR *szFriendlyName = NULL;

        // Try to get the display name.
        UINT32 cchName;
        hr = ppDevices[i]->GetAllocatedString(
            MF_DEVSOURCE_ATTRIBUTE_FRIENDLY_NAME,
            &szFriendlyName, &cchName);

        if (SUCCEEDED(hr))
        {
            OutputDebugString(szFriendlyName);
            OutputDebugString(L"\n");
        }
        CoTaskMemFree(szFriendlyName);
    }
}
If you already know the symbolic link for a video device, there is another way to create the media source for the device:
- Call MFCreateAttributes to create an attribute store.
- Set the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE attribute to MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID.
- Set the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK attribute to the symbolic link.
- Call either the MFCreateDeviceSource or MFCreateDeviceSourceActivate function. The former returns an IMFMediaSource pointer. The latter returns an IMFActivate pointer to an activation object. You can use the activation object to create the source. (An activation object can be marshaled to another process, so it is useful if you want to create the source in another process. For more information, see Activation Objects.)
The following example takes the symbolic link of a video device and creates a media source.
HRESULT CreateVideoCaptureDevice(PCWSTR pszSymbolicLink, IMFMediaSource **ppSource)
{
    *ppSource = NULL;

    IMFAttributes *pAttributes = NULL;

    HRESULT hr = MFCreateAttributes(&pAttributes, 2);

    // Set the device type to video.
    if (SUCCEEDED(hr))
    {
        hr = pAttributes->SetGUID(
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID
            );
    }

    // Set the symbolic link.
    if (SUCCEEDED(hr))
    {
        hr = pAttributes->SetString(
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK,
            pszSymbolicLink
            );
    }

    if (SUCCEEDED(hr))
    {
        hr = MFCreateDeviceSource(pAttributes, ppSource);
    }

    SafeRelease(&pAttributes);
    return hr;
}
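For the MFCreateDeviceSourceActivate path mentioned in the step list, a hedged sketch (assuming pAttributes carries the same device-type and symbolic-link attributes as in the example above; error handling trimmed):

```cpp
// Sketch: get an activation object instead of the source itself.
// pAttributes is assumed to have been configured as in the example above.
IMFActivate *pActivate = NULL;
IMFMediaSource *pSource = NULL;

HRESULT hr = MFCreateDeviceSourceActivate(pAttributes, &pActivate);
if (SUCCEEDED(hr))
{
    // Activate later (possibly in another process) to create the source.
    hr = pActivate->ActivateObject(IID_PPV_ARGS(&pSource));
}
SafeRelease(&pActivate);
```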
There is an equivalent way to create an audio device from the audio endpoint ID:
- Call MFCreateAttributes to create an attribute store.
- Set the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE attribute to MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_GUID.
- Set the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_ENDPOINT_ID attribute to the endpoint ID.
- Call either the MFCreateDeviceSource or MFCreateDeviceSourceActivate function.
The following example takes an audio endpoint ID and creates a media source.
HRESULT CreateAudioCaptureDevice(PCWSTR pszEndPointID, IMFMediaSource **ppSource)
{
    *ppSource = NULL;

    IMFAttributes *pAttributes = NULL;

    HRESULT hr = MFCreateAttributes(&pAttributes, 2);

    // Set the device type to audio.
    if (SUCCEEDED(hr))
    {
        hr = pAttributes->SetGUID(
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_GUID
            );
    }

    // Set the endpoint ID.
    if (SUCCEEDED(hr))
    {
        hr = pAttributes->SetString(
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_ENDPOINT_ID,
            pszEndPointID
            );
    }

    if (SUCCEEDED(hr))
    {
        hr = MFCreateDeviceSource(pAttributes, ppSource);
    }

    SafeRelease(&pAttributes);
    return hr;
}
Use a capture device
After you create the media source for a capture device, use the Source Reader to get data from the device. The Source Reader delivers media samples that contain the captured audio data or video frames. The next step depends on your application scenario:
- Video preview: Use Microsoft Direct3D or Direct2D to display the video.
- File capture: Use the Sink Writer to encode the file.
- Audio preview: Use WASAPI.
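For any of these scenarios, the common first step is pulling samples from the Source Reader. A minimal synchronous loop might look like the following (a sketch, not a complete pipeline; pReader is assumed to be an IMFSourceReader wrapping the capture source):

```cpp
// Sketch: synchronously read video samples from a Source Reader.
for (;;)
{
    DWORD streamIndex = 0, flags = 0;
    LONGLONG timestamp = 0;
    IMFSample *pSample = NULL;

    HRESULT hr = pReader->ReadSample(
        MF_SOURCE_READER_FIRST_VIDEO_STREAM,
        0,                  // no control flags
        &streamIndex,
        &flags,
        &timestamp,
        &pSample);

    if (FAILED(hr) || (flags & MF_SOURCE_READERF_ENDOFSTREAM))
    {
        SafeRelease(&pSample);
        break;
    }

    if (pSample)
    {
        // Hand the frame to Direct3D for preview, or to the Sink Writer for encoding.
        SafeRelease(&pSample);
    }
}
```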
If you want to combine audio capture with video capture, use the aggregate media source. The aggregate media source contains a collection of media sources and combines all of their streams into a single media source object. To create an instance of the aggregate media source, call the MFCreateAggregateSource function.
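The aggregate-source step can be sketched as follows (assuming pVideoSource and pAudioSource were created as shown earlier; error handling trimmed):

```cpp
// Sketch: merge an audio source and a video source into one aggregate source.
IMFCollection *pCollection = NULL;
IMFMediaSource *pAggSource = NULL;

HRESULT hr = MFCreateCollection(&pCollection);
if (SUCCEEDED(hr))
{
    hr = pCollection->AddElement(pVideoSource);
}
if (SUCCEEDED(hr))
{
    hr = pCollection->AddElement(pAudioSource);
}
if (SUCCEEDED(hr))
{
    // The aggregate source exposes the streams of both devices.
    hr = MFCreateAggregateSource(pCollection, &pAggSource);
}
SafeRelease(&pCollection);
```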
Shut down the capture device
When the capture device is no longer needed, you must shut down the device by calling Shutdown on the IMFMediaSource object you obtained by calling MFCreateDeviceSource or IMFActivate::ActivateObject. Failure to call Shutdown can result in memory leaks because the system may keep a reference to IMFMediaSource resources until Shutdown is called.
if (g_pSource)
{
g_pSource->Shutdown();
g_pSource->Release();
g_pSource = NULL;
}
If you allocated a string containing the symbolic link to a capture device, you should release this object as well.
CoTaskMemFree(g_pwszSymbolicLink);
g_pwszSymbolicLink = NULL;
g_cchSymbolicLink = 0;
Handling Video Device Loss
This topic describes how to detect device loss when using a video capture device. It contains the following sections:
- Register For Device Notification
- Get the Symbolic Link of the Device
- Handle WM_DEVICECHANGE
- Unregister For Notification
Register For Device Notification
Before you start capturing from the device, call the RegisterDeviceNotification function to register for device notifications. Register for the KSCATEGORY_CAPTURE device class, as shown in the following code.
#include <Dbt.h>
#include <ks.h>
#include <ksmedia.h>

HDEVNOTIFY g_hdevnotify = NULL;

BOOL RegisterForDeviceNotification(HWND hwnd)
{
    DEV_BROADCAST_DEVICEINTERFACE di = { 0 };
    di.dbcc_size = sizeof(di);
    di.dbcc_devicetype = DBT_DEVTYP_DEVICEINTERFACE;
    di.dbcc_classguid = KSCATEGORY_CAPTURE;

    g_hdevnotify = RegisterDeviceNotification(
        hwnd,
        &di,
        DEVICE_NOTIFY_WINDOW_HANDLE
        );

    if (g_hdevnotify == NULL)
    {
        return FALSE;
    }
    return TRUE;
}
Get the Symbolic Link of the Device
Enumerate the video devices on the system, as described in Enumerating Video Capture Devices. Choose a device from the list, and then query the activation object for the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK attribute, as shown in the following code.
WCHAR *g_pwszSymbolicLink = NULL;
UINT32 g_cchSymbolicLink = 0;
HRESULT GetSymbolicLink(IMFActivate *pActivate)
{
return pActivate->GetAllocatedString(
MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK,
&g_pwszSymbolicLink,
&g_cchSymbolicLink
);
}
Handle WM_DEVICECHANGE
In your message loop, listen for WM_DEVICECHANGE messages. The lParam message parameter is a pointer to a DEV_BROADCAST_HDR structure.
case WM_DEVICECHANGE:
    if (lParam != 0)
    {
        HRESULT hr = S_OK;
        BOOL bDeviceLost = FALSE;

        hr = CheckDeviceLost((PDEV_BROADCAST_HDR)lParam, &bDeviceLost);

        if (FAILED(hr) || bDeviceLost)
        {
            CloseDevice();
            MessageBox(hwnd, L"Lost the capture device.", NULL, MB_OK);
        }
    }
    return TRUE;
Next, compare the device notification message against the symbolic link of your device, as follows:
- Check the dbch_devicetype member of the DEV_BROADCAST_HDR structure. If the value is DBT_DEVTYP_DEVICEINTERFACE, cast the structure pointer to a DEV_BROADCAST_DEVICEINTERFACE structure.
- Compare the dbcc_name member of this structure to the symbolic link of the device.
HRESULT CheckDeviceLost(DEV_BROADCAST_HDR *pHdr, BOOL *pbDeviceLost)
{
    DEV_BROADCAST_DEVICEINTERFACE *pDi = NULL;

    if (pbDeviceLost == NULL)
    {
        return E_POINTER;
    }

    *pbDeviceLost = FALSE;

    if (g_pSource == NULL)
    {
        return S_OK;
    }
    if (pHdr == NULL)
    {
        return S_OK;
    }
    if (pHdr->dbch_devicetype != DBT_DEVTYP_DEVICEINTERFACE)
    {
        return S_OK;
    }

    // Compare the device name with the symbolic link.
    pDi = (DEV_BROADCAST_DEVICEINTERFACE*)pHdr;

    if (g_pwszSymbolicLink)
    {
        if (_wcsicmp(g_pwszSymbolicLink, pDi->dbcc_name) == 0)
        {
            *pbDeviceLost = TRUE;
        }
    }
    return S_OK;
}
Unregister For Notification
Before the application exits, call UnregisterDeviceNotification to unregister for device notifications.
void OnClose(HWND /*hwnd*/)
{
    if (g_hdevnotify)
    {
        UnregisterDeviceNotification(g_hdevnotify);
    }
    PostQuitMessage(0);
}
Media Type Debugging Code
You can use the following code to view the contents of a media type while debugging.
// The following code enables you to view the contents of a media type while
// debugging.

#include <strsafe.h>

LPCWSTR GetGUIDNameConst(const GUID& guid);
HRESULT GetGUIDName(const GUID& guid, WCHAR **ppwsz);

HRESULT LogAttributeValueByIndex(IMFAttributes *pAttr, DWORD index);
HRESULT SpecialCaseAttributeValue(GUID guid, const PROPVARIANT& var);

void DBGMSG(PCWSTR format, ...);

HRESULT LogMediaType(IMFMediaType *pType)
{
    UINT32 count = 0;

    HRESULT hr = pType->GetCount(&count);
    if (FAILED(hr))
    {
        return hr;
    }

    if (count == 0)
    {
        DBGMSG(L"Empty media type.\n");
    }

    for (UINT32 i = 0; i < count; i++)
    {
        hr = LogAttributeValueByIndex(pType, i);
        if (FAILED(hr))
        {
            break;
        }
    }
    return hr;
}

HRESULT LogAttributeValueByIndex(IMFAttributes *pAttr, DWORD index)
{
    WCHAR *pGuidName = NULL;
    WCHAR *pGuidValName = NULL;

    GUID guid = { 0 };

    PROPVARIANT var;
    PropVariantInit(&var);

    HRESULT hr = pAttr->GetItemByIndex(index, &guid, &var);
    if (FAILED(hr))
    {
        goto done;
    }

    hr = GetGUIDName(guid, &pGuidName);
    if (FAILED(hr))
    {
        goto done;
    }

    DBGMSG(L"\t%s\t", pGuidName);

    hr = SpecialCaseAttributeValue(guid, var);
    if (FAILED(hr))
    {
        goto done;
    }
    if (hr == S_FALSE)
    {
        switch (var.vt)
        {
        case VT_UI4:
            DBGMSG(L"%d", var.ulVal);
            break;

        case VT_UI8:
            DBGMSG(L"%I64d", var.uhVal);
            break;

        case VT_R8:
            DBGMSG(L"%f", var.dblVal);
            break;

        case VT_CLSID:
            hr = GetGUIDName(*var.puuid, &pGuidValName);
            if (SUCCEEDED(hr))
            {
                DBGMSG(pGuidValName);
            }
            break;

        case VT_LPWSTR:
            DBGMSG(var.pwszVal);
            break;

        case VT_VECTOR | VT_UI1:
            DBGMSG(L"<<byte array>>");
            break;

        case VT_UNKNOWN:
            DBGMSG(L"IUnknown");
            break;

        default:
            DBGMSG(L"Unexpected attribute type (vt = %d)", var.vt);
            break;
        }
    }

done:
    DBGMSG(L"\n");
    CoTaskMemFree(pGuidName);
    CoTaskMemFree(pGuidValName);
    PropVariantClear(&var);
    return hr;
}

HRESULT GetGUIDName(const GUID& guid, WCHAR **ppwsz)
{
    HRESULT hr = S_OK;
    WCHAR *pName = NULL;

    LPCWSTR pcwsz = GetGUIDNameConst(guid);
    if (pcwsz)
    {
        size_t cchLength = 0;

        hr = StringCchLength(pcwsz, STRSAFE_MAX_CCH, &cchLength);
        if (FAILED(hr))
        {
            goto done;
        }

        pName = (WCHAR*)CoTaskMemAlloc((cchLength + 1) * sizeof(WCHAR));
        if (pName == NULL)
        {
            hr = E_OUTOFMEMORY;
            goto done;
        }

        hr = StringCchCopy(pName, cchLength + 1, pcwsz);
        if (FAILED(hr))
        {
            goto done;
        }
    }
    else
    {
        hr = StringFromCLSID(guid, &pName);
    }

done:
    if (FAILED(hr))
    {
        *ppwsz = NULL;
        CoTaskMemFree(pName);
    }
    else
    {
        *ppwsz = pName;
    }
    return hr;
}

void LogUINT32AsUINT64(const PROPVARIANT& var)
{
    UINT32 uHigh = 0, uLow = 0;
    Unpack2UINT32AsUINT64(var.uhVal.QuadPart, &uHigh, &uLow);
    DBGMSG(L"%d x %d", uHigh, uLow);
}

float OffsetToFloat(const MFOffset& offset)
{
    return offset.value + (static_cast<float>(offset.fract) / 65536.0f);
}

HRESULT LogVideoArea(const PROPVARIANT& var)
{
    if (var.caub.cElems < sizeof(MFVideoArea))
    {
        return MF_E_BUFFERTOOSMALL;
    }

    MFVideoArea *pArea = (MFVideoArea*)var.caub.pElems;

    DBGMSG(L"(%f,%f) (%d,%d)",
        OffsetToFloat(pArea->OffsetX), OffsetToFloat(pArea->OffsetY),
        pArea->Area.cx, pArea->Area.cy);
    return S_OK;
}

// Handle certain known special cases.
HRESULT SpecialCaseAttributeValue(GUID guid, const PROPVARIANT& var)
{
    if ((guid == MF_MT_FRAME_RATE) || (guid == MF_MT_FRAME_RATE_RANGE_MAX) ||
        (guid == MF_MT_FRAME_RATE_RANGE_MIN) || (guid == MF_MT_FRAME_SIZE) ||
        (guid == MF_MT_PIXEL_ASPECT_RATIO))
    {
        // Attributes that contain two packed 32-bit values.
        LogUINT32AsUINT64(var);
    }
    else if ((guid == MF_MT_GEOMETRIC_APERTURE) ||
             (guid == MF_MT_MINIMUM_DISPLAY_APERTURE) ||
             (guid == MF_MT_PAN_SCAN_APERTURE))
    {
        // Attributes that contain an MFVideoArea structure.
        return LogVideoArea(var);
    }
    else
    {
        return S_FALSE;
    }
    return S_OK;
}

void DBGMSG(PCWSTR format, ...)
{
    va_list args;
    va_start(args, format);

    WCHAR msg[MAX_PATH];

    if (SUCCEEDED(StringCbVPrintf(msg, sizeof(msg), format, args)))
    {
        OutputDebugString(msg);
    }
    va_end(args);
}

#ifndef IF_EQUAL_RETURN
#define IF_EQUAL_RETURN(param, val) if(val == param) return L#val
#endif

LPCWSTR GetGUIDNameConst(const GUID& guid)
{
    IF_EQUAL_RETURN(guid, MF_MT_MAJOR_TYPE);
    IF_EQUAL_RETURN(guid, MF_MT_SUBTYPE);
    IF_EQUAL_RETURN(guid, MF_MT_ALL_SAMPLES_INDEPENDENT);
    IF_EQUAL_RETURN(guid, MF_MT_FIXED_SIZE_SAMPLES);
    IF_EQUAL_RETURN(guid, MF_MT_COMPRESSED);
    IF_EQUAL_RETURN(guid, MF_MT_SAMPLE_SIZE);
    IF_EQUAL_RETURN(guid, MF_MT_WRAPPED_TYPE);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_NUM_CHANNELS);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_SAMPLES_PER_SECOND);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_FLOAT_SAMPLES_PER_SECOND);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_AVG_BYTES_PER_SECOND);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_BLOCK_ALIGNMENT);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_BITS_PER_SAMPLE);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_VALID_BITS_PER_SAMPLE);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_SAMPLES_PER_BLOCK);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_CHANNEL_MASK);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_FOLDDOWN_MATRIX);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_WMADRC_PEAKREF);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_WMADRC_PEAKTARGET);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_WMADRC_AVGREF);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_WMADRC_AVGTARGET);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_PREFER_WAVEFORMATEX);
    IF_EQUAL_RETURN(guid, MF_MT_AAC_PAYLOAD_TYPE);
    IF_EQUAL_RETURN(guid, MF_MT_AAC_AUDIO_PROFILE_LEVEL_INDICATION);
    IF_EQUAL_RETURN(guid, MF_MT_FRAME_SIZE);
    IF_EQUAL_RETURN(guid, MF_MT_FRAME_RATE);
    IF_EQUAL_RETURN(guid, MF_MT_FRAME_RATE_RANGE_MAX);
    IF_EQUAL_RETURN(guid, MF_MT_FRAME_RATE_RANGE_MIN);
    IF_EQUAL_RETURN(guid, MF_MT_PIXEL_ASPECT_RATIO);
    IF_EQUAL_RETURN(guid, MF_MT_DRM_FLAGS);
    IF_EQUAL_RETURN(guid, MF_MT_PAD_CONTROL_FLAGS);
    IF_EQUAL_RETURN(guid, MF_MT_SOURCE_CONTENT_HINT);
    IF_EQUAL_RETURN(guid, MF_MT_VIDEO_CHROMA_SITING);
    IF_EQUAL_RETURN(guid, MF_MT_INTERLACE_MODE);
    IF_EQUAL_RETURN(guid, MF_MT_TRANSFER_FUNCTION);
    IF_EQUAL_RETURN(guid, MF_MT_VIDEO_PRIMARIES);
    IF_EQUAL_RETURN(guid, MF_MT_CUSTOM_VIDEO_PRIMARIES);
    IF_EQUAL_RETURN(guid, MF_MT_YUV_MATRIX);
    IF_EQUAL_RETURN(guid, MF_MT_VIDEO_LIGHTING);
    IF_EQUAL_RETURN(guid, MF_MT_VIDEO_NOMINAL_RANGE);
    IF_EQUAL_RETURN(guid, MF_MT_GEOMETRIC_APERTURE);
    IF_EQUAL_RETURN(guid, MF_MT_MINIMUM_DISPLAY_APERTURE);
    IF_EQUAL_RETURN(guid, MF_MT_PAN_SCAN_APERTURE);
    IF_EQUAL_RETURN(guid, MF_MT_PAN_SCAN_ENABLED);
    IF_EQUAL_RETURN(guid, MF_MT_AVG_BITRATE);
    IF_EQUAL_RETURN(guid, MF_MT_AVG_BIT_ERROR_RATE);
    IF_EQUAL_RETURN(guid, MF_MT_MAX_KEYFRAME_SPACING);
    IF_EQUAL_RETURN(guid, MF_MT_DEFAULT_STRIDE);
    IF_EQUAL_RETURN(guid, MF_MT_PALETTE);
    IF_EQUAL_RETURN(guid, MF_MT_USER_DATA);
    IF_EQUAL_RETURN(guid, MF_MT_AM_FORMAT_TYPE);
    IF_EQUAL_RETURN(guid, MF_MT_MPEG_START_TIME_CODE);
    IF_EQUAL_RETURN(guid, MF_MT_MPEG2_PROFILE);
    IF_EQUAL_RETURN(guid, MF_MT_MPEG2_LEVEL);
    IF_EQUAL_RETURN(guid, MF_MT_MPEG2_FLAGS);
    IF_EQUAL_RETURN(guid, MF_MT_MPEG_SEQUENCE_HEADER);
    IF_EQUAL_RETURN(guid, MF_MT_DV_AAUX_SRC_PACK_0);
    IF_EQUAL_RETURN(guid, MF_MT_DV_AAUX_CTRL_PACK_0);
    IF_EQUAL_RETURN(guid, MF_MT_DV_AAUX_SRC_PACK_1);
    IF_EQUAL_RETURN(guid, MF_MT_DV_AAUX_CTRL_PACK_1);
    IF_EQUAL_RETURN(guid, MF_MT_DV_VAUX_SRC_PACK);
    IF_EQUAL_RETURN(guid, MF_MT_DV_VAUX_CTRL_PACK);
    IF_EQUAL_RETURN(guid, MF_MT_ARBITRARY_HEADER);
    IF_EQUAL_RETURN(guid, MF_MT_ARBITRARY_FORMAT);
    IF_EQUAL_RETURN(guid, MF_MT_IMAGE_LOSS_TOLERANT);
    IF_EQUAL_RETURN(guid, MF_MT_MPEG4_SAMPLE_DESCRIPTION);
    IF_EQUAL_RETURN(guid, MF_MT_MPEG4_CURRENT_SAMPLE_ENTRY);
    IF_EQUAL_RETURN(guid, MF_MT_ORIGINAL_4CC);
    IF_EQUAL_RETURN(guid, MF_MT_ORIGINAL_WAVE_FORMAT_TAG);

    // Media types
    IF_EQUAL_RETURN(guid, MFMediaType_Audio);
    IF_EQUAL_RETURN(guid, MFMediaType_Video);
    IF_EQUAL_RETURN(guid, MFMediaType_Protected);
    IF_EQUAL_RETURN(guid, MFMediaType_SAMI);
    IF_EQUAL_RETURN(guid, MFMediaType_Script);
    IF_EQUAL_RETURN(guid, MFMediaType_Image);
    IF_EQUAL_RETURN(guid, MFMediaType_HTML);
    IF_EQUAL_RETURN(guid, MFMediaType_Binary);
    IF_EQUAL_RETURN(guid, MFMediaType_FileTransfer);

    IF_EQUAL_RETURN(guid, MFVideoFormat_AI44);   // FCC('AI44')
    IF_EQUAL_RETURN(guid, MFVideoFormat_ARGB32); // D3DFMT_A8R8G8B8
    IF_EQUAL_RETURN(guid, MFVideoFormat_AYUV);   // FCC('AYUV')
    IF_EQUAL_RETURN(guid, MFVideoFormat_DV25);   // FCC('dv25')
    IF_EQUAL_RETURN(guid, MFVideoFormat_DV50);   // FCC('dv50')
    IF_EQUAL_RETURN(guid, MFVideoFormat_DVH1);   // FCC('dvh1')
    IF_EQUAL_RETURN(guid, MFVideoFormat_DVSD);   // FCC('dvsd')
    IF_EQUAL_RETURN(guid, MFVideoFormat_DVSL);   // FCC('dvsl')
    IF_EQUAL_RETURN(guid, MFVideoFormat_H264);   // FCC('H264')
    IF_EQUAL_RETURN(guid, MFVideoFormat_I420);   // FCC('I420')
    IF_EQUAL_RETURN(guid, MFVideoFormat_IYUV);   // FCC('IYUV')
    IF_EQUAL_RETURN(guid, MFVideoFormat_M4S2);   // FCC('M4S2')
    IF_EQUAL_RETURN(guid, MFVideoFormat_MJPG);
    IF_EQUAL_RETURN(guid, MFVideoFormat_MP43);   // FCC('MP43')
    IF_EQUAL_RETURN(guid, MFVideoFormat_MP4S);   // FCC('MP4S')
    IF_EQUAL_RETURN(guid, MFVideoFormat_MP4V);   // FCC('MP4V')
    IF_EQUAL_RETURN(guid, MFVideoFormat_MPG1);   // FCC('MPG1')
    IF_EQUAL_RETURN(guid, MFVideoFormat_MSS1);   // FCC('MSS1')
    IF_EQUAL_RETURN(guid, MFVideoFormat_MSS2);   // FCC('MSS2')
    IF_EQUAL_RETURN(guid, MFVideoFormat_NV11);   // FCC('NV11')
    IF_EQUAL_RETURN(guid, MFVideoFormat_NV12);   // FCC('NV12')
    IF_EQUAL_RETURN(guid, MFVideoFormat_P010);   // FCC('P010')
    IF_EQUAL_RETURN(guid, MFVideoFormat_P016);   // FCC('P016')
    IF_EQUAL_RETURN(guid, MFVideoFormat_P210);   // FCC('P210')
    IF_EQUAL_RETURN(guid, MFVideoFormat_P216);   // FCC('P216')
    IF_EQUAL_RETURN(guid, MFVideoFormat_RGB24);  // D3DFMT_R8G8B8
    IF_EQUAL_RETURN(guid, MFVideoFormat_RGB32);  // D3DFMT_X8R8G8B8
    IF_EQUAL_RETURN(guid, MFVideoFormat_RGB555); // D3DFMT_X1R5G5B5
    IF_EQUAL_RETURN(guid, MFVideoFormat_RGB565); // D3DFMT_R5G6B5
    IF_EQUAL_RETURN(guid, MFVideoFormat_RGB8);
    IF_EQUAL_RETURN(guid, MFVideoFormat_UYVY);   // FCC('UYVY')
    IF_EQUAL_RETURN(guid, MFVideoFormat_v210);   // FCC('v210')
    IF_EQUAL_RETURN(guid, MFVideoFormat_v410);   // FCC('v410')
    IF_EQUAL_RETURN(guid, MFVideoFormat_WMV1);   // FCC('WMV1')
    IF_EQUAL_RETURN(guid, MFVideoFormat_WMV2);   // FCC('WMV2')
    IF_EQUAL_RETURN(guid, MFVideoFormat_WMV3);   // FCC('WMV3')
    IF_EQUAL_RETURN(guid, MFVideoFormat_WVC1);   // FCC('WVC1')
    IF_EQUAL_RETURN(guid, MFVideoFormat_Y210);   // FCC('Y210')
    IF_EQUAL_RETURN(guid, MFVideoFormat_Y216);   // FCC('Y216')
    IF_EQUAL_RETURN(guid, MFVideoFormat_Y410);   // FCC('Y410')
    IF_EQUAL_RETURN(guid, MFVideoFormat_Y416);   // FCC('Y416')
    IF_EQUAL_RETURN(guid, MFVideoFormat_Y41P);
    IF_EQUAL_RETURN(guid, MFVideoFormat_Y41T);
    IF_EQUAL_RETURN(guid, MFVideoFormat_YUY2);   // FCC('YUY2')
    IF_EQUAL_RETURN(guid, MFVideoFormat_YV12);   // FCC('YV12')
    IF_EQUAL_RETURN(guid, MFVideoFormat_YVYU);

    IF_EQUAL_RETURN(guid, MFAudioFormat_PCM);              // WAVE_FORMAT_PCM
    IF_EQUAL_RETURN(guid, MFAudioFormat_Float);            // WAVE_FORMAT_IEEE_FLOAT
    IF_EQUAL_RETURN(guid, MFAudioFormat_DTS);              // WAVE_FORMAT_DTS
    IF_EQUAL_RETURN(guid, MFAudioFormat_Dolby_AC3_SPDIF);  // WAVE_FORMAT_DOLBY_AC3_SPDIF
    IF_EQUAL_RETURN(guid, MFAudioFormat_DRM);              // WAVE_FORMAT_DRM
    IF_EQUAL_RETURN(guid, MFAudioFormat_WMAudioV8);        // WAVE_FORMAT_WMAUDIO2
    IF_EQUAL_RETURN(guid, MFAudioFormat_WMAudioV9);        // WAVE_FORMAT_WMAUDIO3
    IF_EQUAL_RETURN(guid, MFAudioFormat_WMAudio_Lossless); // WAVE_FORMAT_WMAUDIO_LOSSLESS
    IF_EQUAL_RETURN(guid, MFAudioFormat_WMASPDIF);         // WAVE_FORMAT_WMASPDIF
    IF_EQUAL_RETURN(guid, MFAudioFormat_MSP1);             // WAVE_FORMAT_WMAVOICE9
    IF_EQUAL_RETURN(guid, MFAudioFormat_MP3);              // WAVE_FORMAT_MPEGLAYER3
    IF_EQUAL_RETURN(guid, MFAudioFormat_MPEG);             // WAVE_FORMAT_MPEG
    IF_EQUAL_RETURN(guid, MFAudioFormat_AAC);              // WAVE_FORMAT_MPEG_HEAAC
    IF_EQUAL_RETURN(guid, MFAudioFormat_ADTS);             // WAVE_FORMAT_MPEG_ADTS_AAC

    return NULL;
}