Hauppauge PVR-250 / 350 Driver

John-Mark Gurney gurney_j at efn.org
Thu Dec 4 13:18:26 PST 2003


Andrew Gordon wrote this message on Thu, Dec 04, 2003 at 00:37 +0000:
> > > This isn't really the case for the PVR-250 / 350.  The MPEG program
> > > stream from the card contains both sound and video.
> >
> > Yeh, and I did say most.. :)  I was thinking that if the card does
> > something special with sound, like integrating it into a codec stream,
> > then the sound capture could/should? be set as a parameter of the
> > device that does the integration...  Because I assume that the card
> > can only capture sound as part of the MPEG codec stream? (going to
> > check the specs).
> >
> > Do people think that sound should be brought under its umbrella?
> > I don't want to delay it too much by expanding the scope..  Just getting
> > video is going to be difficult enough..
> 
> Well, some means of supporting sound alongside video is certainly
> required: if the API doesn't handle sound explicitly, then it needs to
> convey the necessary timing information so that the sound can be handled
> via separate APIs without losing synchronisation.

Well, if the sound is completely encoded in the codec stream, then
videobus would not know about it...  I do plan on doing kernel-side
timestamp tagging of frames so that the upper layers will be able to
get this info.  The only problem with this is that the timestamps may
not agree with the sound card unless the sound card side also does
timestamping, since the local clock can (and often does) run at a
different rate than the sound card's sampling clock...
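
To make the idea concrete, here is a rough sketch of what I mean by
tagging frames in the kernel.  None of these names exist anywhere yet
(videobus_frame, videobus_frame_done and the vf_* fields are all made
up for this mail); the only real piece is nanouptime(9), which reads
the kernel's monotonic clock:

/*
 * Sketch only -- videobus_frame and videobus_frame_done() are names
 * invented for this mail, not anything that exists yet.  The point is
 * simply to stamp each completed frame with the kernel's monotonic
 * clock as soon as the hardware hands it over.
 */
#include <sys/param.h>
#include <sys/time.h>

struct videobus_frame {
	void		*vf_data;	/* MPEG program stream data or raw frame */
	size_t		 vf_len;	/* number of valid bytes in vf_data */
	struct timespec	 vf_stamp;	/* local clock at capture completion */
};

static void
videobus_frame_done(struct videobus_frame *vf)
{
	/* Tag the frame before it gets queued for the upper layers. */
	nanouptime(&vf->vf_stamp);

	/* ... hand vf to whatever read/mmap path the upper layer uses ... */
}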

> For many kinds of device, all you need is to report the delay through the
> device (and its associated driver buffering).  However, some kinds of
> device don't have constant delay and timestamps are required.  Devices
> such as the MPEG capture card mentioned earlier are a problem to fit into
> such a scheme, as although the video will come out nicely timestamped (and
> the audio likewise), there may not be a means to relate those timestamps
> to real-time (i.e. you can achieve A/V sync between A and V streams
> captured on the card as designed, but it's hard to synchronize with sound
> captured on a standard sound-card at the same time, for example).

Yeh, it's a sticky problem that I don't want to completely ignore,
but from my point of view, about the only thing I can do is timestamp
against the local clock; anything more would require significant
fingers in each subsystem's pie to be doable...
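
Just to illustrate the scale of the problem, here's roughly how a
userland app could watch the two clocks drift apart.  Nothing below is
an existing API besides clock_gettime(2); the struct and function
names are invented for this mail:

#include <stdint.h>
#include <time.h>

struct av_sync {
	struct timespec	start;		/* local clock when capture started */
	uint64_t	samples;	/* audio samples read so far */
	unsigned	rate;		/* nominal sample rate, e.g. 48000 */
};

/*
 * How far the local clock has advanced beyond where the audio stream
 * says we should be.  If this grows steadily, the two clocks run at
 * different rates and the app has to resample (or drop/duplicate
 * frames) to hold A/V sync.
 */
static double
av_drift_seconds(const struct av_sync *s)
{
	struct timespec now;
	double elapsed, audio_pos;

	clock_gettime(CLOCK_MONOTONIC, &now);
	elapsed = (now.tv_sec - s->start.tv_sec) +
	    (now.tv_nsec - s->start.tv_nsec) / 1e9;
	audio_pos = (double)s->samples / s->rate;
	return (elapsed - audio_pos);
}

Anything smarter than watching that drift means getting timestamps out
of the sound driver itself, which is exactly the "fingers in each
subsystem's pie" part.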

-- 
  John-Mark Gurney				Voice: +1 415 225 5579

     "All that I will do, has been done, All that I have, has not."

