PushSource Source Filter


The PushSource source filter example generates a stream of black video frames (in RGB32 format) that have yellow frame numbers printed on them. The PushSource filter generates frames at a rate of 30 per second at normal (1x) playback speed, and it keeps going for 10 seconds (the value set by the DEFAULT_DURATION constant), for a total of 300 frames. In operation, the PushSource filter, which appears in the list of DirectShow filters under the name Generic Push Source, looks like Figure 12-5.


Figure 12-5. The PushSource filter creating a video stream with numbered frames

The DirectShow C++ base classes include three classes that are instrumental in implementing PushSource: CSource, which implements a basic source filter; CSourceStream, which implements the presentation of a stream on the source filter's output pin; and CSourceSeeking, which implements the seek command on the output pin of the source filter. If your source filter has more than one output pin (some do, most don't), you can't use CSourceSeeking as a basis for an implementation of the seek command. In that case, you'd need to coordinate between the two pins, which requires careful thread synchronization that CSourceSeeking doesn't handle. (You can use CSourceStream for multiple output pins.) In a simple source filter, such as PushSource, only a minimal implementation is required, and nearly all of that is in the implementation of methods in CSourceStream and CSourceSeeking.

Implementing the Source Filter Class

The implementation of the source filter class CPushSource for the PushSource example is entirely straightforward. Only two methods need to be overridden: the class constructor, which creates the output pin, and CPushSource::CreateInstance, which calls the class constructor to create an instance of the filter. Here's the complete implementation of CPushSource:

// CPushSource class: Our source filter.
class CPushSource : public CSource
{
private:
    // Constructor is private.
    // You have to use CreateInstance to create it.
    CPushSource(IUnknown *pUnk, HRESULT *phr);

public:
    static CUnknown * WINAPI CreateInstance(IUnknown *pUnk, HRESULT *phr);
};

CPushSource::CPushSource(IUnknown *pUnk, HRESULT *phr)
    : CSource(NAME("PushSource"), pUnk, CLSID_PushSource)
{
    // Create the output pin.
    // The pin magically adds itself to the pin array.
    CPushPin *pPin = new CPushPin(phr, this);
    if (pPin == NULL)
    {
        *phr = E_OUTOFMEMORY;
    }
}

CUnknown * WINAPI CPushSource::CreateInstance(IUnknown *pUnk, HRESULT *phr)
{
    CPushSource *pNewFilter = new CPushSource(pUnk, phr);
    if (pNewFilter == NULL)
    {
        *phr = E_OUTOFMEMORY;
    }
    return pNewFilter;
}

This is all the code required to create the source filter object.

Implementing the Source Filter Output Pin Class

All the real work in the PushSource filter takes place inside the filter s output pin. The output pin class, CPushPin, is declared as a descendant of both the CSourceStream and CSourceSeeking classes, as shown in the following code:

class CPushPin : public CSourceStream, public CSourceSeeking
{
private:
    REFERENCE_TIME m_rtStreamTime;  // Stream time (relative to
                                    // when the graph started)
    REFERENCE_TIME m_rtSourceTime;  // Source time (relative to ourselves)

    // A note about seeking and time stamps:
    // Suppose you have a file source that is N seconds long.
    // If you play the file from the beginning at normal playback
    // rate (1x), the presentation time for each frame will match
    // the source file:
    //     Frame: 0  1  2 ... N
    //     Time:  0  1  2 ... N
    // Now suppose you seek in the file to an arbitrary spot,
    // frame i. After the seek command, the graph's clock resets
    // to zero. Therefore, the first frame delivered after the seek
    // has a presentation time of zero:
    //     Frame: i  i+1  i+2 ...
    //     Time:  0  1    2   ...
    // Therefore we have to track stream time and source time
    // independently. (If you do not support seeking, then source
    // time always equals presentation time.)

    REFERENCE_TIME m_rtFrameLength;  // Frame length
    int m_iFrameNumber;       // Current frame number that we are rendering
    BOOL m_bDiscontinuity;    // If true, set the discontinuity flag
    CCritSec m_cSharedState;  // Protects our internal state
    ULONG_PTR m_gdiplusToken; // GDI+ initialization token

    // Private function to draw our bitmaps.
    HRESULT WriteToBuffer(LPWSTR wszText, BYTE *pData,
        VIDEOINFOHEADER *pVih);

    // Update our internal state after a seek command.
    void UpdateFromSeek();

    // The following methods support seeking using other time formats
    // besides reference time. If you want to support only
    // seek-by-reference-time, you do not have to override these methods.
    STDMETHODIMP SetTimeFormat(const GUID *pFormat);
    STDMETHODIMP GetTimeFormat(GUID *pFormat);
    STDMETHODIMP IsUsingTimeFormat(const GUID *pFormat);
    STDMETHODIMP IsFormatSupported(const GUID *pFormat);
    STDMETHODIMP QueryPreferredFormat(GUID *pFormat);
    STDMETHODIMP ConvertTimeFormat(LONGLONG *pTarget,
        const GUID *pTargetFormat,
        LONGLONG Source, const GUID *pSourceFormat);
    STDMETHODIMP SetPositions(LONGLONG *pCurrent, DWORD CurrentFlags,
        LONGLONG *pStop, DWORD StopFlags);
    STDMETHODIMP GetDuration(LONGLONG *pDuration);
    STDMETHODIMP GetStopPosition(LONGLONG *pStop);

    // Conversions between reference times and frame numbers.
    LONGLONG FrameToTime(LONGLONG frame)
    {
        LONGLONG f = frame * m_rtFrameLength;
        return f;
    }
    LONGLONG TimeToFrame(LONGLONG rt) { return rt / m_rtFrameLength; }

    GUID m_TimeFormat;  // Which time format is currently active

protected:
    // Override CSourceStream methods.
    HRESULT GetMediaType(CMediaType *pMediaType);
    HRESULT CheckMediaType(const CMediaType *pMediaType);
    HRESULT DecideBufferSize(IMemAllocator *pAlloc,
        ALLOCATOR_PROPERTIES *pRequest);
    HRESULT FillBuffer(IMediaSample *pSample);

    // The following methods support seeking.
    HRESULT OnThreadStartPlay();
    HRESULT ChangeStart();
    HRESULT ChangeStop();
    HRESULT ChangeRate();
    STDMETHODIMP SetRate(double dRate);

public:
    CPushPin(HRESULT *phr, CSource *pFilter);
    ~CPushPin();

    // Override this to expose IMediaSeeking.
    STDMETHODIMP NonDelegatingQueryInterface(REFIID riid, void **ppv);

    // We don't support any quality control.
    STDMETHODIMP Notify(IBaseFilter *pSelf, Quality q) { return E_FAIL; }
};

Let's begin with an examination of the three public methods defined in CPushPin and implemented in the following code: the class constructor, the destructor, and CPushPin::NonDelegatingQueryInterface.

CPushPin::CPushPin(HRESULT *phr, CSource *pFilter) :
    CSourceStream(NAME("CPushPin"), phr, pFilter, L"Out"),
    CSourceSeeking(NAME("PushPin2Seek"), (IPin*)this, phr, &m_cSharedState),
    m_rtStreamTime(0),
    m_rtSourceTime(0),
    m_iFrameNumber(0),
    m_rtFrameLength(Fps2FrameLength(DEFAULT_FRAME_RATE))
{
    // Initialize GDI+.
    GdiplusStartupInput gdiplusStartupInput;
    Status s = GdiplusStartup(&m_gdiplusToken, &gdiplusStartupInput, NULL);
    if (s != Ok)
    {
        *phr = E_FAIL;
    }

    // SEEKING: Set the source duration and the initial stop time.
    m_rtDuration = m_rtStop = DEFAULT_DURATION;
}

CPushPin::~CPushPin()
{
    // Shut down GDI+.
    GdiplusShutdown(m_gdiplusToken);
}

STDMETHODIMP CPushPin::NonDelegatingQueryInterface(REFIID riid, void **ppv)
{
    if (riid == IID_IMediaSeeking)
    {
        return CSourceSeeking::NonDelegatingQueryInterface(riid, ppv);
    }
    return CSourceStream::NonDelegatingQueryInterface(riid, ppv);
}

The class constructor invokes the constructors for the two ancestor classes of CPushPin, CSourceStream and CSourceSeeking, and then initializes GDI+ (Microsoft's 2D graphics library). GDI+ will create the numbered video frames streamed over the pin. The class destructor shuts down GDI+, and CPushPin::NonDelegatingQueryInterface determines whether a QueryInterface call issued to the PushSource filter object is handled by CSourceStream or CSourceSeeking. Because only seek commands are handled by CSourceSeeking, only those commands are passed to that implementation.

Negotiating Connections on a Source Filter

CPushPin has two methods to handle the media type negotiation, CPushPin::GetMediaType and CPushPin::CheckMediaType. These methods, together with the method CPushPin::DecideBufferSize, handle pin-to-pin connection negotiation, as implemented in the following code:

HRESULT CPushPin::GetMediaType(CMediaType *pMediaType)
{
    CheckPointer(pMediaType, E_POINTER);
    CAutoLock cAutoLock(m_pFilter->pStateLock());

    // Call our helper function that fills in the media type.
    return CreateRGBVideoType(pMediaType, 32, DEFAULT_WIDTH,
        DEFAULT_HEIGHT, DEFAULT_FRAME_RATE);
}

HRESULT CPushPin::CheckMediaType(const CMediaType *pMediaType)
{
    CAutoLock lock(m_pFilter->pStateLock());

    // Is it a video type?
    if (pMediaType->majortype != MEDIATYPE_Video)
    {
        return E_FAIL;
    }

    // Is it 32-bit RGB?
    if ((pMediaType->subtype != MEDIASUBTYPE_RGB32))
    {
        return E_FAIL;
    }

    // Is it a VIDEOINFOHEADER type?
    if ((pMediaType->formattype == FORMAT_VideoInfo) &&
        (pMediaType->cbFormat >= sizeof(VIDEOINFOHEADER)) &&
        (pMediaType->pbFormat != NULL))
    {
        VIDEOINFOHEADER *pVih = (VIDEOINFOHEADER*)pMediaType->pbFormat;

        // We don't do source rects.
        if (!IsRectEmpty(&(pVih->rcSource)))
        {
            return E_FAIL;
        }

        // Valid frame rate?
        if (pVih->AvgTimePerFrame != m_rtFrameLength)
        {
            return E_FAIL;
        }

        // Everything checked out.
        return S_OK;
    }
    return E_FAIL;
}

HRESULT CPushPin::DecideBufferSize(IMemAllocator *pAlloc,
    ALLOCATOR_PROPERTIES *pRequest)
{
    CAutoLock cAutoLock(m_pFilter->pStateLock());
    HRESULT hr;

    VIDEOINFO *pvi = (VIDEOINFO*) m_mt.Format();
    ASSERT(pvi != NULL);

    if (pRequest->cBuffers == 0)
    {
        pRequest->cBuffers = 1;  // We need at least one buffer.
    }

    // Buffer size must be at least big enough to hold our image.
    if ((long)pvi->bmiHeader.biSizeImage > pRequest->cbBuffer)
    {
        pRequest->cbBuffer = pvi->bmiHeader.biSizeImage;
    }

    // Try to set these properties.
    ALLOCATOR_PROPERTIES Actual;
    hr = pAlloc->SetProperties(pRequest, &Actual);
    if (FAILED(hr))
    {
        return hr;
    }

    // Check what we actually got.
    if (Actual.cbBuffer < pRequest->cbBuffer)
    {
        return E_FAIL;
    }
    return S_OK;
}

All three methods begin with the inline creation of a CAutoLock object, a class DirectShow uses to lock a critical section of code. The lock is released when CAutoLock's destructor is called, which in this case happens when the object goes out of scope as the method exits. This is a clever and simple way to implement critical sections of code. CPushPin::GetMediaType always returns a media type for RGB32 video because that's the only type it supports. The media type is built in a call to CreateRGBVideoType, a helper function that can be easily rewritten to handle the creation of a broader array of media types. CPushPin::CheckMediaType ensures that any presented media type exactly matches the RGB32 type supported by the pin; any deviation from the one media type supported by the pin results in failure. Finally, CPushPin::DecideBufferSize examines the size of each sample (in this case, a video frame) and requests at least one buffer of that size. The method will accept more storage if offered but will settle for no less than that minimum amount. These three methods must be implemented by every source filter.

Implementing Media Sample Creation Methods

One media sample creation method must be implemented in CPushPin. This is the method by which video frames are created and passed along as media samples. When the streaming thread is created, it enters a loop, and CPushPin::FillBuffer is called on every pass through the loop as long as the filter is active (either paused or running). In CPushPin, this method is paired with an internal method, CPushPin::WriteToBuffer, which actually creates the image, as shown in the following code:

HRESULT CPushPin::FillBuffer(IMediaSample *pSample)
{
    HRESULT hr;
    BYTE *pData;
    long cbData;
    WCHAR msg[256];

    // Get a pointer to the buffer.
    pSample->GetPointer(&pData);
    cbData = pSample->GetSize();

    // Check if the downstream filter is changing the format.
    CMediaType *pmt;
    hr = pSample->GetMediaType((AM_MEDIA_TYPE**)&pmt);
    if (hr == S_OK)
    {
        SetMediaType(pmt);
        DeleteMediaType(pmt);
    }

    // Get our format information.
    ASSERT(m_mt.formattype == FORMAT_VideoInfo);
    ASSERT(m_mt.cbFormat >= sizeof(VIDEOINFOHEADER));
    VIDEOINFOHEADER *pVih = (VIDEOINFOHEADER*)m_mt.pbFormat;

    {
        // Scope for the state lock, which protects the frame number
        // and ref times.
        CAutoLock cAutoLockShared(&m_cSharedState);

        // Have we reached the stop time yet?
        if (m_rtSourceTime >= m_rtStop)
        {
            // This will cause the base class to send an
            // EndOfStream downstream.
            return S_FALSE;
        }

        // Time stamp the sample.
        REFERENCE_TIME rtStart, rtStop;
        rtStart = (REFERENCE_TIME)(m_rtStreamTime / m_dRateSeeking);
        rtStop = rtStart +
            (REFERENCE_TIME)(pVih->AvgTimePerFrame / m_dRateSeeking);
        pSample->SetTime(&rtStart, &rtStop);

        // Write the frame number into our text buffer.
        swprintf(msg, L"%d", m_iFrameNumber);

        // Increment the frame number and ref times for the next
        // time through the loop.
        m_iFrameNumber++;
        m_rtSourceTime += pVih->AvgTimePerFrame;
        m_rtStreamTime += pVih->AvgTimePerFrame;
    }

    // Private method to draw the image.
    hr = WriteToBuffer(msg, pData, pVih);
    if (FAILED(hr))
    {
        return hr;
    }

    // Every frame is a key frame.
    pSample->SetSyncPoint(TRUE);
    return S_OK;
}

HRESULT CPushPin::WriteToBuffer(LPWSTR wszText, BYTE *pData,
    VIDEOINFOHEADER *pVih)
{
    ASSERT(pVih->bmiHeader.biBitCount == 32);

    DWORD dwWidth, dwHeight;
    long lStride;
    BYTE *pbTop;

    // Get the width, height, top row of pixels, and stride.
    GetVideoInfoParameters(pVih, pData, &dwWidth, &dwHeight, &lStride,
        &pbTop, false);

    // Create a GDI+ bitmap object to manage our image buffer.
    Bitmap bitmap((int)dwWidth, (int)dwHeight, abs(lStride),
        PixelFormat32bppRGB, pData);

    // Create a GDI+ graphics object to manage the drawing.
    Graphics g(&bitmap);

    // Turn on anti-aliasing.
    g.SetSmoothingMode(SmoothingModeAntiAlias);
    g.SetTextRenderingHint(TextRenderingHintAntiAlias);

    // Erase the background.
    g.Clear(Color(0x0, 0, 0, 0));

    // GDI+ is top-down by default, so if our image format is bottom-up,
    // we need to set a transform on the Graphics object to flip
    // the image.
    if (pVih->bmiHeader.biHeight > 0)
    {
        // Flip the image around the X axis.
        g.ScaleTransform(1.0, -1.0);
        // Move it back into place.
        g.TranslateTransform(0, (float)dwHeight, MatrixOrderAppend);
    }

    SolidBrush brush(Color(0xFF, 0xFF, 0xFF, 0));  // Yellow brush
    Font font(FontFamily::GenericSerif(), 48);     // Big serif type
    RectF rcBounds(30, 30, (float)dwWidth, (float)dwHeight);
                                                   // Bounding rectangle

    // Draw the text.
    g.DrawString(wszText, -1, &font, rcBounds,
        StringFormat::GenericDefault(), &brush);

    return S_OK;
}

Upon entering CPushPin::FillBuffer, the media sample is examined for a valid media type. Then a thread lock is taken, and the method determines whether the source filter's stop time has been reached. If it has, the routine returns S_FALSE, indicating that no media sample will be generated. The base class responds by sending an end-of-stream notification downstream, and the downstream filters drain their remaining buffers as the stream concludes.

If the filter stop time has not been reached, the sample's timestamp is set with a call to IMediaSample::SetTime. This timestamp is composed of two pieces (both REFERENCE_TIME values, expressed in 100-nanosecond units), indicating the sample's start time and stop time. The calculation divides the stream time by the current playback rate (m_dRateSeeking), which at the default 1x rate produces a series of timestamps separated by the time per frame (in this case, one thirtieth of a second). At this point, the frame number is written to a string stored in the object and then incremented.

That frame number is put to work in CPushPin::WriteToBuffer, which creates the RGB32-based video frame using GDI+. The method makes a call to the helper function GetVideoInfoParameters, retrieving the video parameters it uses to create a bitmap of appropriate width and height. With a few GDI+ calls, CPushPin::WriteToBuffer clears the background (to black) and then inverts the drawing image if the video format is bottom-up rather than top-down (the default for GDI+). A brush and then a font are selected, and the frame number is drawn to the bitmap inside a bounding rectangle.



Programming Microsoft DirectShow for Digital Video and Television (Pro-Developer)
ISBN: 0735618216
Year: 2002
Pages: 108
Author: Mark D. Pesce
