LCOV - code coverage report
Current view: top level - dom/media - MediaDecoderStateMachine.h (source / functions)
Test:         output.info
Date:         2017-07-14 16:53:18
Coverage:     Lines: 0 / 55 (0.0 %)   Functions: 0 / 32 (0.0 %)
Legend:       Lines: hit | not hit

          Line data    Source code
       1             : /* -*- Mode: C++; tab-width: 2; indent-tabs-mode: nil; c-basic-offset: 2 -*- */
       2             : /* vim:set ts=2 sw=2 sts=2 et cindent: */
       3             : /* This Source Code Form is subject to the terms of the Mozilla Public
       4             :  * License, v. 2.0. If a copy of the MPL was not distributed with this
       5             :  * file, You can obtain one at http://mozilla.org/MPL/2.0/. */
       6             : /*
       7             : 
       8             : Each media element for a media file has one thread called the "audio thread".
       9             : 
      10             : The audio thread writes the decoded audio data to the audio
      11             : hardware. This is done in a separate thread to ensure that the
      12             : audio hardware gets a constant stream of data without
      13             : interruption due to decoding or display. At some point
      14             : AudioStream will be refactored to have a callback interface
      15             : where it asks for data and this thread will no longer be
      16             : needed.
      17             : 
      18             : The element/state machine also has a TaskQueue which runs in a
      19             : SharedThreadPool that is shared with all other elements/decoders. The state
      20             : machine dispatches tasks to this to call into the MediaDecoderReader to
      21             : request decoded audio or video data. The Reader will call back with decoded
      22             : samples when it has them available, and the state machine places the decoded
      23             : samples into its queues for the consuming threads to pull from.
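                     :
                     : As a rough illustration of that request/callback flow (a simplified sketch
                     : only; the promise pattern is the one used elsewhere in this header, but the
                     : lambda bodies are illustrative and the real wiring lives in
                     : MediaDecoderStateMachine.cpp):
                     :
                     :   RefPtr<MediaDecoderStateMachine> self = this;
                     :   mReader->RequestAudioData()->Then(
                     :     OwnerThread(), __func__,
                     :     [self](RefPtr<AudioData> aAudio) {
                     :       // A decoded sample arrived; queue it for the consuming threads.
                     :       self->PushAudio(aAudio);
                     :     },
                     :     [self](const MediaResult& aError) {
                     :       self->DecodeError(aError);
                     :     });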
      24             : 
      25             : The MediaDecoderReader can choose to decode asynchronously, or synchronously
      26             : and return requested samples synchronously inside its Request*Data()
      27             : functions via callback. Asynchronous decoding is preferred, and should be
      28             : used for any new readers.
      29             : 
      30             : Synchronisation of state between the threads is done via a monitor owned
      31             : by MediaDecoder.
      32             : 
      33             : The lifetime of the audio thread is controlled by the state machine when
      34             : it runs on the shared state machine thread. When playback needs to occur
      35             : the audio thread is created and an event dispatched to run it. The audio
      36             : thread exits when audio playback is completed or no longer required.
      37             : 
      38             : A/V synchronisation is handled by the state machine. It examines the audio
      39             : playback time and compares this to the next frame in the queue of video
      40             : frames. If it is time to play the video frame, the frame is displayed; otherwise
      41             : the state machine schedules itself to run again when the next frame is due.
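                     :
                     : In sketch form, that decision looks roughly like this (illustrative only;
                     : FrameStartTime() and RenderVideoFrame() are stand-ins for what the
                     : implementation in MediaDecoderStateMachine.cpp actually does):
                     :
                     :   media::TimeUnit clock = GetClock();
                     :   RefPtr<VideoData> frame = VideoQueue().PeekFront();
                     :   if (frame) {
                     :     media::TimeUnit frameTime = FrameStartTime(frame);  // stand-in accessor
                     :     if (frameTime <= clock) {
                     :       RenderVideoFrame(frame);  // stand-in: hand the frame to the compositor
                     :       VideoQueue().PopFront();
                     :     } else {
                     :       // Not due yet; wake up again when the frame should be displayed.
                     :       ScheduleStateMachineIn(frameTime - clock);
                     :     }
                     :   }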
      42             : 
      43             : Frame skipping is done in the following ways (a sketch follows this list):
      44             : 
      45             :   1) The state machine will skip all frames in the video queue whose
      46             :      display time is less than the current audio time. This ensures
      47             :      the correct frame for the current time is always displayed.
      48             : 
      49             :   2) The decode tasks will stop decoding interframes and read to the
      50             :      next keyframe if they determine that decoding the remaining
      51             :      interframes will cause playback issues. This is detected when:
      52             :        a) the amount of audio data in the audio queue drops
      53             :           below a threshold at which audio may start to skip, or
      54             :        b) the video queue drops below a threshold where it
      55             :           would be decoding video data that won't be displayed because
      56             :           the decode thread drops the frames immediately.
      57             :      TODO: In future we should only do this when the Reader is decoding
      58             :            synchronously.
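                     :
                     : The sketch promised above (illustrative only; FrameEndTime() is a stand-in
                     : accessor and the real checks live in the reader and state machine
                     : implementations):
                     :
                     :   // 1) Drop queued frames whose display interval has already passed.
                     :   media::TimeUnit clock = GetClock();
                     :   while (VideoQueue().GetSize() > 1 &&
                     :          FrameEndTime(VideoQueue().PeekFront()) < clock) {  // stand-in
                     :     VideoQueue().PopFront();
                     :   }
                     :
                     :   // 2) Ask the reader to skip to the next keyframe when decode is behind.
                     :   bool skipToNextKeyframe =
                     :     HasLowDecodedAudio() ||  // 2a) audio queue close to underrunning
                     :     HasLowDecodedVideo();    // 2b) video queue too far behind the clock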
      59             : 
      60             : When hardware-accelerated graphics is not available, YCbCr conversion
      61             : is done on the decode task queue when video frames are decoded.
      62             : 
      63             : The decode task queue pushes decoded audio and video frames into two
      64             : separate queues - one for audio and one for video. These are kept
      65             : separate to make it easy to constantly feed audio data to the audio
      66             : hardware while allowing frame skipping of video data. These queues are
      67             : threadsafe, and none of the decode, audio, or state machine threads should
      68             : be able to monopolize them and cause starvation of the other threads.
      69             : 
      70             : Both queues are bounded by a maximum size. When this size is reached
      71             : the decode tasks will no longer request video or audio, depending on which
      72             : queue has reached the threshold. If both queues are full, no more
      73             : decode tasks will be dispatched to the decode task queue, so other
      74             : decoders will have an opportunity to run.
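                     :
                     : Roughly, when the state machine decides whether to ask the reader for more
                     : data, it applies these bounds like this (a sketch; the helpers named here
                     : are declared later in this file, but the exact conditions live in the
                     : implementation):
                     :
                     :   if (!HaveEnoughDecodedAudio() &&
                     :       !IsRequestingAudioData() && !IsWaitingAudioData()) {
                     :     RequestAudioData();
                     :   }
                     :   if (!HaveEnoughDecodedVideo() &&
                     :       !IsRequestingVideoData() && !IsWaitingVideoData()) {
                     :     RequestVideoData(GetMediaTime());
                     :   }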
      75             : 
      76             : During playback the audio thread will be idle (via a Wait() on the
      77             : monitor) if the audio queue is empty. Otherwise it constantly pops
      78             : audio data off the queue and plays it with a blocking write to the audio
      79             : hardware (via AudioStream).
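                     :
                     : Conceptually that loop looks like this (a sketch; decoderMonitor,
                     : shuttingDown and WriteToAudioStream() are illustrative names, and the real
                     : loop lives in the audio sink):
                     :
                     :   while (!shuttingDown) {
                     :     RefPtr<AudioData> audio;
                     :     {
                     :       ReentrantMonitorAutoEnter mon(decoderMonitor);
                     :       while (AudioQueue().GetSize() == 0 && !shuttingDown) {
                     :         mon.Wait();  // idle until more audio is decoded
                     :       }
                     :       if (shuttingDown) {
                     :         break;
                     :       }
                     :       audio = AudioQueue().PopFront();
                     :     }
                     :     WriteToAudioStream(audio);  // blocking write to the hardware (stand-in)
                     :   }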
      80             : 
      81             : */
      82             : #if !defined(MediaDecoderStateMachine_h__)
      83             : #define MediaDecoderStateMachine_h__
      84             : 
      85             : #include "mozilla/Attributes.h"
      86             : #include "mozilla/ReentrantMonitor.h"
      87             : #include "mozilla/StateMirroring.h"
      88             : 
      89             : #include "nsAutoPtr.h"
      90             : #include "nsThreadUtils.h"
      91             : #include "MediaDecoder.h"
      92             : #include "MediaDecoderReader.h"
      93             : #include "MediaDecoderOwner.h"
      94             : #include "MediaEventSource.h"
      95             : #include "MediaMetadataManager.h"
      96             : #include "MediaStatistics.h"
      97             : #include "MediaTimer.h"
      98             : #include "ImageContainer.h"
      99             : #include "SeekJob.h"
     100             : 
     101             : namespace mozilla {
     102             : 
     103             : namespace media {
     104             : class MediaSink;
     105             : }
     106             : 
     107             : class AbstractThread;
     108             : class AudioSegment;
     109             : class DecodedStream;
     110             : class MediaDecoderReaderWrapper;
     111             : class OutputStreamManager;
     112             : class TaskQueue;
     113             : 
     114             : extern LazyLogModule gMediaDecoderLog;
     115             : 
     116             : enum class MediaEventType : int8_t
     117             : {
     118             :   PlaybackStarted,
     119             :   PlaybackStopped,
     120             :   PlaybackEnded,
     121             :   SeekStarted,
     122             :   Invalidate,
     123             :   EnterVideoSuspend,
     124             :   ExitVideoSuspend,
     125             :   StartVideoSuspendTimer,
     126             :   CancelVideoSuspendTimer,
     127             :   VideoOnlySeekBegin,
     128             :   VideoOnlySeekCompleted,
     129             : };
     130             : 
     131             : enum class VideoDecodeMode : uint8_t
     132             : {
     133             :   Normal,
     134             :   Suspend
     135             : };
     136             : 
     137             : /*
      138             :   The state machine class. This manages the decoding and seeking in the
      139             :   MediaDecoderReader on the decode task queue, A/V sync on the shared
      140             :   state machine thread, and the audio "push" thread.
     141             : 
     142             :   All internal state is synchronised via the decoder monitor. State changes
     143             :   are propagated by scheduling the state machine to run another cycle on the
     144             :   shared state machine thread.
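                     :
                     :   A typical propagation path is sketched below (illustrative; mPlayState,
                     :   mWatchManager and the named methods are declared in this class, but the
                     :   actual registration and handling happen in the implementation file):
                     :
                     :     // Registered once, e.g. during initialization on the task queue:
                     :     mWatchManager.Watch(mPlayState,
                     :                         &MediaDecoderStateMachine::PlayStateChanged);
                     :
                     :     // Later, when the mirrored play state changes:
                     :     void MediaDecoderStateMachine::PlayStateChanged()
                     :     {
                     :       MOZ_ASSERT(OnTaskQueue());
                     :       ScheduleStateMachine();  // run another cycle to react to the change
                     :     }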
     145             : 
     146             :   See MediaDecoder.h for more details.
     147             : */
     148             : class MediaDecoderStateMachine
     149             : {
     150           0 :   NS_INLINE_DECL_THREADSAFE_REFCOUNTING(MediaDecoderStateMachine)
     151             : 
     152             :   using TrackSet = MediaDecoderReader::TrackSet;
     153             : 
     154             : public:
     155             :   typedef MediaDecoderOwner::NextFrameStatus NextFrameStatus;
     156             :   typedef mozilla::layers::ImageContainer::FrameID FrameID;
     157             :   MediaDecoderStateMachine(MediaDecoder* aDecoder,
     158             :                            MediaDecoderReader* aReader);
     159             : 
     160             :   nsresult Init(MediaDecoder* aDecoder);
     161             : 
     162             :   // Enumeration for the valid decoding states
     163             :   enum State
     164             :   {
     165             :     DECODER_STATE_DECODING_METADATA,
     166             :     DECODER_STATE_WAIT_FOR_CDM,
     167             :     DECODER_STATE_DORMANT,
     168             :     DECODER_STATE_DECODING_FIRSTFRAME,
     169             :     DECODER_STATE_DECODING,
     170             :     DECODER_STATE_SEEKING,
     171             :     DECODER_STATE_BUFFERING,
     172             :     DECODER_STATE_COMPLETED,
     173             :     DECODER_STATE_SHUTDOWN
     174             :   };
     175             : 
     176             :   RefPtr<MediaDecoder::DebugInfoPromise> RequestDebugInfo();
     177             : 
     178             :   void AddOutputStream(ProcessedMediaStream* aStream, bool aFinishWhenEnded);
     179             :   // Remove an output stream added with AddOutputStream.
     180             :   void RemoveOutputStream(MediaStream* aStream);
     181             : 
      182             :   // Seeks the decoder to aTarget asynchronously.
     183             :   RefPtr<MediaDecoder::SeekPromise> InvokeSeek(const SeekTarget& aTarget);
     184             : 
     185           0 :   void DispatchSetPlaybackRate(double aPlaybackRate)
     186             :   {
     187           0 :     OwnerThread()->DispatchStateChange(
     188           0 :       NewRunnableMethod<double>("MediaDecoderStateMachine::SetPlaybackRate",
     189             :                                 this,
     190             :                                 &MediaDecoderStateMachine::SetPlaybackRate,
     191           0 :                                 aPlaybackRate));
     192           0 :   }
     193             : 
     194             :   RefPtr<ShutdownPromise> BeginShutdown();
     195             : 
     196             :   // Set the media fragment end time.
     197           0 :   void DispatchSetFragmentEndTime(const media::TimeUnit& aEndTime)
     198             :   {
     199           0 :     RefPtr<MediaDecoderStateMachine> self = this;
     200           0 :     nsCOMPtr<nsIRunnable> r = NS_NewRunnableFunction(
     201             :       "MediaDecoderStateMachine::DispatchSetFragmentEndTime",
     202           0 :       [self, aEndTime]() {
     203             :         // A negative number means we don't have a fragment end time at all.
     204           0 :         self->mFragmentEndTime = aEndTime >= media::TimeUnit::Zero()
     205             :                                    ? aEndTime
     206           0 :                                    : media::TimeUnit::Invalid();
     207           0 :       });
     208           0 :     OwnerThread()->Dispatch(r.forget());
     209           0 :   }
     210             : 
     211             :   // Drop reference to mResource. Only called during shutdown dance.
     212           0 :   void BreakCycles() {
     213           0 :     MOZ_ASSERT(NS_IsMainThread());
     214           0 :     mResource = nullptr;
     215           0 :   }
     216             : 
     217           0 :   TimedMetadataEventSource& TimedMetadataEvent() {
     218           0 :     return mMetadataManager.TimedMetadataEvent();
     219             :   }
     220             : 
     221             :   MediaEventSource<void>& OnMediaNotSeekable() const;
     222             : 
     223             :   MediaEventSourceExc<UniquePtr<MediaInfo>,
     224             :                       UniquePtr<MetadataTags>,
     225             :                       MediaDecoderEventVisibility>&
     226           0 :   MetadataLoadedEvent() { return mMetadataLoadedEvent; }
     227             : 
     228             :   MediaEventSourceExc<nsAutoPtr<MediaInfo>,
     229             :                       MediaDecoderEventVisibility>&
     230           0 :   FirstFrameLoadedEvent() { return mFirstFrameLoadedEvent; }
     231             : 
     232             :   MediaEventSource<MediaEventType>&
     233           0 :   OnPlaybackEvent() { return mOnPlaybackEvent; }
     234             :   MediaEventSource<MediaResult>&
     235           0 :   OnPlaybackErrorEvent() { return mOnPlaybackErrorEvent; }
     236             : 
     237             :   MediaEventSource<DecoderDoctorEvent>&
     238           0 :   OnDecoderDoctorEvent() { return mOnDecoderDoctorEvent; }
     239             : 
     240             :   size_t SizeOfVideoQueue() const;
     241             : 
     242             :   size_t SizeOfAudioQueue() const;
     243             : 
     244             :   // Sets the video decode mode. Used by the suspend-video-decoder feature.
     245             :   void SetVideoDecodeMode(VideoDecodeMode aMode);
     246             : 
     247             : private:
     248             :   class StateObject;
     249             :   class DecodeMetadataState;
     250             :   class WaitForCDMState;
     251             :   class DormantState;
     252             :   class DecodingFirstFrameState;
     253             :   class DecodingState;
     254             :   class SeekingState;
     255             :   class AccurateSeekingState;
     256             :   class NextFrameSeekingState;
     257             :   class NextFrameSeekingFromDormantState;
     258             :   class VideoOnlySeekingState;
     259             :   class BufferingState;
     260             :   class CompletedState;
     261             :   class ShutdownState;
     262             : 
     263             :   static const char* ToStateStr(State aState);
     264             :   static const char* ToStr(NextFrameStatus aStatus);
     265             :   const char* ToStateStr();
     266             : 
     267             :   nsCString GetDebugInfo();
     268             : 
     269             :   // Functions used by assertions to ensure we're calling things
     270             :   // on the appropriate threads.
     271             :   bool OnTaskQueue() const;
     272             : 
     273             :   // Initialization that needs to happen on the task queue. This is the first
     274             :   // task that gets run on the task queue, and is dispatched from the MDSM
     275             :   // constructor immediately after the task queue is created.
     276             :   void InitializationTask(MediaDecoder* aDecoder);
     277             : 
     278             :   void SetAudioCaptured(bool aCaptured);
     279             : 
     280             :   RefPtr<MediaDecoder::SeekPromise> Seek(const SeekTarget& aTarget);
     281             : 
     282             :   RefPtr<ShutdownPromise> Shutdown();
     283             : 
     284             :   RefPtr<ShutdownPromise> FinishShutdown();
     285             : 
     286             :   // Update the playback position. This can result in a timeupdate event
     287             :   // and an invalidate of the frame being dispatched asynchronously if
     288             :   // there is no such event currently queued.
     289             :   // Only called on the decoder thread. Must be called with
     290             :   // the decode monitor held.
     291             :   void UpdatePlaybackPosition(const media::TimeUnit& aTime);
     292             : 
     293             :   bool CanPlayThrough();
     294             : 
     295             :   MediaStatistics GetStatistics();
     296             : 
     297           0 :   bool HasAudio() const { return mInfo.ref().HasAudio(); }
     298           0 :   bool HasVideo() const { return mInfo.ref().HasVideo(); }
     299           0 :   const MediaInfo& Info() const { return mInfo.ref(); }
     300             : 
     301             :   // Returns the state machine task queue.
     302           0 :   TaskQueue* OwnerThread() const { return mTaskQueue; }
     303             : 
     304             :   // Schedules the shared state machine thread to run the state machine.
     305             :   void ScheduleStateMachine();
     306             : 
     307             :   // Invokes ScheduleStateMachine to run in |aTime|,
     308             :   // unless it's already scheduled to run earlier, in which case the
     309             :   // request is discarded.
     310             :   void ScheduleStateMachineIn(const media::TimeUnit& aTime);
     311             : 
     312             :   bool HaveEnoughDecodedAudio();
     313             :   bool HaveEnoughDecodedVideo();
     314             : 
     315             :   // Returns true if we're currently playing. The decoder monitor must
     316             :   // be held.
     317             :   bool IsPlaying() const;
     318             : 
     319             :   // Sets mMediaSeekable to false.
     320             :   void SetMediaNotSeekable();
     321             : 
     322             :   // Resets all states related to decoding and aborts all pending requests
     323             :   // to the decoders.
     324             :   void ResetDecode(TrackSet aTracks = TrackSet(TrackInfo::kAudioTrack,
     325             :                                                TrackInfo::kVideoTrack));
     326             : 
     327             :   void SetVideoDecodeModeInternal(VideoDecodeMode aMode);
     328             : 
     329             : protected:
     330             :   virtual ~MediaDecoderStateMachine();
     331             : 
     332             :   void BufferedRangeUpdated();
     333             : 
     334             :   void ReaderSuspendedChanged();
     335             : 
     336             :   // Inserts a sample into the Audio/Video queue.
     337             :   // aSample must not be null.
     338             :   void PushAudio(AudioData* aSample);
     339             :   void PushVideo(VideoData* aSample);
     340             : 
     341             :   void OnAudioPopped(const RefPtr<AudioData>& aSample);
     342             :   void OnVideoPopped(const RefPtr<VideoData>& aSample);
     343             : 
     344             :   void AudioAudibleChanged(bool aAudible);
     345             : 
     346             :   void VolumeChanged();
     347             :   void SetPlaybackRate(double aPlaybackRate);
     348             :   void PreservesPitchChanged();
     349             : 
     350           0 :   MediaQueue<AudioData>& AudioQueue() { return mAudioQueue; }
     351           0 :   MediaQueue<VideoData>& VideoQueue() { return mVideoQueue; }
     352             : 
     353             :   // True if we are low in decoded audio/video data.
     354             :   // May not be invoked when mReader->UseBufferingHeuristics() is false.
     355             :   bool HasLowDecodedData();
     356             : 
     357             :   bool HasLowDecodedAudio();
     358             : 
     359             :   bool HasLowDecodedVideo();
     360             : 
     361             :   bool OutOfDecodedAudio();
     362             : 
     363           0 :   bool OutOfDecodedVideo()
     364             :   {
     365           0 :     MOZ_ASSERT(OnTaskQueue());
     366           0 :     return IsVideoDecoding() && VideoQueue().GetSize() <= 1;
     367             :   }
     368             : 
     369             : 
     370             :   // Returns true if we're running low on buffered data.
     371             :   bool HasLowBufferedData();
     372             : 
     373             :   // Returns true if we have less than aThreshold of buffered data available.
     374             :   bool HasLowBufferedData(const media::TimeUnit& aThreshold);
     375             : 
     376             :   void UpdateNextFrameStatus(NextFrameStatus aStatus);
     377             : 
     378             :   // Return the current time, either the audio clock if available (if the media
      379             :   // has audio and playback is possible), or a clock for the video.
     380             :   // Called on the state machine thread.
     381             :   // If aTimeStamp is non-null, set *aTimeStamp to the TimeStamp corresponding
     382             :   // to the returned stream time.
     383             :   media::TimeUnit GetClock(TimeStamp* aTimeStamp = nullptr) const;
     384             : 
     385             :   // Update only the state machine's current playback position (and duration,
     386             :   // if unknown).  Does not update the playback position on the decoder or
     387             :   // media element -- use UpdatePlaybackPosition for that.  Called on the state
     388             :   // machine thread, caller must hold the decoder lock.
     389             :   void UpdatePlaybackPositionInternal(const media::TimeUnit& aTime);
     390             : 
     391             :   // Update playback position and trigger next update by default time period.
     392             :   // Called on the state machine thread.
     393             :   void UpdatePlaybackPositionPeriodically();
     394             : 
     395             :   media::MediaSink* CreateAudioSink();
     396             : 
      397             :   // Always creates a media sink, which contains an AudioSink or a StreamSink inside.
     398             :   already_AddRefed<media::MediaSink> CreateMediaSink(bool aAudioCaptured);
     399             : 
      400             :   // Stops the media sink and shuts it down.
     401             :   // The decoder monitor must be held with exactly one lock count.
     402             :   // Called on the state machine thread.
     403             :   void StopMediaSink();
     404             : 
     405             :   // Create and start the media sink.
     406             :   // The decoder monitor must be held with exactly one lock count.
     407             :   // Called on the state machine thread.
     408             :   void StartMediaSink();
     409             : 
     410             :   // Notification method invoked when mPlayState changes.
     411             :   void PlayStateChanged();
     412             : 
     413             :   // Notification method invoked when mIsVisible changes.
     414             :   void VisibilityChanged();
     415             : 
     416             :   // Sets internal state which causes playback of media to pause.
     417             :   // The decoder monitor must be held.
     418             :   void StopPlayback();
     419             : 
     420             :   // If the conditions are right, sets internal state which causes playback
     421             :   // of media to begin or resume.
     422             :   // Must be called with the decode monitor held.
     423             :   void MaybeStartPlayback();
     424             : 
     425             :   // Moves the decoder into the shutdown state, and dispatches an error
     426             :   // event to the media element. This begins shutting down the decoder.
     427             :   // The decoder monitor must be held. This is only called on the
     428             :   // decode thread.
     429             :   void DecodeError(const MediaResult& aError);
     430             : 
     431             :   void EnqueueFirstFrameLoadedEvent();
     432             : 
     433             :   // Start a task to decode audio.
     434             :   void RequestAudioData();
     435             : 
     436             :   // Start a task to decode video.
     437             :   void RequestVideoData(const media::TimeUnit& aCurrentTime);
     438             : 
     439             :   void WaitForData(MediaData::Type aType);
     440             : 
     441           0 :   bool IsRequestingAudioData() const { return mAudioDataRequest.Exists(); }
     442           0 :   bool IsRequestingVideoData() const { return mVideoDataRequest.Exists(); }
     443           0 :   bool IsWaitingAudioData() const { return mAudioWaitRequest.Exists(); }
     444           0 :   bool IsWaitingVideoData() const { return mVideoWaitRequest.Exists(); }
     445             : 
      446             :   // Returns the "media time". This is the absolute time that media
      447             :   // playback has reached, i.e. it returns values in the range
      448             :   // [mStartTime, mEndTime], and mStartTime will not be 0 if the media does
      449             :   // not start at 0. Note this is different from the "current playback position",
     450             :   // which is in the range [0,duration].
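                     :   // For example (illustrative numbers): for a resource whose first sample
                     :   // starts at 5s, after 3 seconds of playback GetMediaTime() returns 8s,
                     :   // while the "current playback position" exposed to the element is 3s
                     :   // (media time minus the start time).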
     451           0 :   media::TimeUnit GetMediaTime() const
     452             :   {
     453           0 :     MOZ_ASSERT(OnTaskQueue());
     454           0 :     return mCurrentPosition;
     455             :   }
     456             : 
     457             :   // Returns an upper bound on the number of microseconds of audio that is
     458             :   // decoded and playable. This is the sum of the number of usecs of audio which
     459             :   // is decoded and in the reader's audio queue, and the usecs of unplayed audio
     460             :   // which has been pushed to the audio hardware for playback. Note that after
     461             :   // calling this, the audio hardware may play some of the audio pushed to
      462             :   // hardware, so this can only be used as an upper bound. The decoder monitor
     463             :   // must be held when calling this. Called on the decode thread.
     464             :   media::TimeUnit GetDecodedAudioDuration();
     465             : 
     466             :   void FinishDecodeFirstFrame();
     467             : 
     468             :   // Performs one "cycle" of the state machine.
     469             :   void RunStateMachine();
     470             : 
     471             :   bool IsStateMachineScheduled() const;
     472             : 
     473             :   // These return true if the respective stream's decode has not yet reached
     474             :   // the end of stream.
     475             :   bool IsAudioDecoding();
     476             :   bool IsVideoDecoding();
     477             : 
     478             : private:
      479             :   // Resolved by the MediaSink to signal that all outstanding audio/video work
      480             :   // is complete and to identify which part (a/v) of the sink is shutting down.
     481             :   void OnMediaSinkAudioComplete();
     482             :   void OnMediaSinkVideoComplete();
     483             : 
     484             :   // Rejected by the MediaSink to signal errors for audio/video.
     485             :   void OnMediaSinkAudioError(nsresult aResult);
     486             :   void OnMediaSinkVideoError();
     487             : 
     488             :   void* const mDecoderID;
     489             :   const RefPtr<AbstractThread> mAbstractMainThread;
     490             :   const RefPtr<FrameStatistics> mFrameStats;
     491             :   const RefPtr<VideoFrameContainer> mVideoFrameContainer;
     492             :   const dom::AudioChannel mAudioChannel;
     493             : 
     494             :   // Task queue for running the state machine.
     495             :   RefPtr<TaskQueue> mTaskQueue;
     496             : 
     497             :   // State-watching manager.
     498             :   WatchManager<MediaDecoderStateMachine> mWatchManager;
     499             : 
     500             :   // True if we've dispatched a task to run the state machine but the task has
     501             :   // yet to run.
     502             :   bool mDispatchedStateMachine;
     503             : 
      504             :   // Used to schedule another state machine cycle at a specific target time.
     505             :   DelayedScheduler mDelayedScheduler;
     506             : 
     507             :   // Queue of audio frames. This queue is threadsafe, and is accessed from
     508             :   // the audio, decoder, state machine, and main threads.
     509             :   MediaQueue<AudioData> mAudioQueue;
     510             :   // Queue of video frames. This queue is threadsafe, and is accessed from
     511             :   // the decoder, state machine, and main threads.
     512             :   MediaQueue<VideoData> mVideoQueue;
     513             : 
     514             :   UniquePtr<StateObject> mStateObj;
     515             : 
     516           0 :   media::TimeUnit Duration() const
     517             :   {
     518           0 :     MOZ_ASSERT(OnTaskQueue());
     519           0 :     return mDuration.Ref().ref();
     520             :   }
     521             : 
     522             :   // Recomputes the canonical duration from various sources.
     523             :   void RecomputeDuration();
     524             : 
     525             : 
     526             :   // FrameID which increments every time a frame is pushed to our queue.
     527             :   FrameID mCurrentFrameID;
     528             : 
     529             :   // The highest timestamp that our position has reached. Monotonically
     530             :   // increasing.
     531             :   Watchable<media::TimeUnit> mObservedDuration;
     532             : 
      533             :   // Returns true if we're logically playing, that is, if Play() has
     534             :   // been called and Pause() has not or we have not yet reached the end
     535             :   // of media. This is irrespective of the seeking state; if the owner
     536             :   // calls Play() and then Seek(), we still count as logically playing.
     537             :   // The decoder monitor must be held.
     538             :   bool IsLogicallyPlaying()
     539             :   {
     540             :     MOZ_ASSERT(OnTaskQueue());
     541             :     return mPlayState == MediaDecoder::PLAY_STATE_PLAYING
     542             :            || mNextPlayState == MediaDecoder::PLAY_STATE_PLAYING;
     543             :   }
     544             : 
     545             :   // Media Fragment end time.
     546             :   media::TimeUnit mFragmentEndTime = media::TimeUnit::Invalid();
     547             : 
     548             :   // The media sink resource.  Used on the state machine thread.
     549             :   RefPtr<media::MediaSink> mMediaSink;
     550             : 
     551             :   const RefPtr<MediaDecoderReaderWrapper> mReader;
     552             : 
     553             :   // The end time of the last audio frame that's been pushed onto the media sink
     554             :   // in microseconds. This will approximately be the end time
     555             :   // of the audio stream, unless another frame is pushed to the hardware.
     556             :   media::TimeUnit AudioEndTime() const;
     557             : 
     558             :   // The end time of the last rendered video frame that's been sent to
     559             :   // compositor.
     560             :   media::TimeUnit VideoEndTime() const;
     561             : 
     562             :   // The end time of the last decoded audio frame. This signifies the end of
     563             :   // decoded audio data. Used to check if we are low in decoded data.
     564             :   media::TimeUnit mDecodedAudioEndTime;
     565             : 
     566             :   // The end time of the last decoded video frame. Used to check if we are low
     567             :   // on decoded video data.
     568             :   media::TimeUnit mDecodedVideoEndTime;
     569             : 
      570             :   // Playback rate. 1.0 : normal speed, 0.5 : half speed.
     571             :   double mPlaybackRate;
     572             : 
     573             :   // If we've got more than this number of decoded video frames waiting in
     574             :   // the video queue, we will not decode any more video frames until some have
     575             :   // been consumed by the play state machine thread.
     576             :   // Must hold monitor.
     577             :   uint32_t GetAmpleVideoFrames() const;
     578             : 
     579             :   // Low audio threshold. If we've decoded less than this much audio we
     580             :   // consider our audio decode "behind", and we may skip video decoding
     581             :   // in order to allow our audio decoding to catch up. We favour audio
     582             :   // decoding over video. We increase this threshold if we're slow to
     583             :   // decode video frames, in order to reduce the chance of audio underruns.
     584             :   // Note that we don't ever reset this threshold, it only ever grows as
     585             :   // we detect that the decode can't keep up with rendering.
     586             :   media::TimeUnit mLowAudioThreshold;
     587             : 
      588             :   // Our "ample" audio threshold. Once we've decoded this much audio, we
     589             :   // pause decoding. If we increase mLowAudioThreshold, we'll also
     590             :   // increase this too appropriately (we don't want mLowAudioThreshold
     591             :   // to be greater than mAmpleAudioThreshold, else we'd stop decoding!).
     592             :   // Note that we don't ever reset this threshold, it only ever grows as
     593             :   // we detect that the decode can't keep up with rendering.
     594             :   media::TimeUnit mAmpleAudioThreshold;
     595             : 
     596             :   // Only one of a given pair of ({Audio,Video}DataPromise, WaitForDataPromise)
     597             :   // should exist at any given moment.
     598             :   using AudioDataPromise = MediaDecoderReader::AudioDataPromise;
     599             :   using VideoDataPromise = MediaDecoderReader::VideoDataPromise;
     600             :   using WaitForDataPromise = MediaDecoderReader::WaitForDataPromise;
     601             :   MozPromiseRequestHolder<AudioDataPromise> mAudioDataRequest;
     602             :   MozPromiseRequestHolder<VideoDataPromise> mVideoDataRequest;
     603             :   MozPromiseRequestHolder<WaitForDataPromise> mAudioWaitRequest;
     604             :   MozPromiseRequestHolder<WaitForDataPromise> mVideoWaitRequest;
     605             : 
     606             :   const char* AudioRequestStatus() const;
     607             :   const char* VideoRequestStatus() const;
     608             : 
     609             :   void OnSuspendTimerResolved();
     610             :   void CancelSuspendTimer();
     611             : 
     612             :   // True if we shouldn't play our audio (but still write it to any capturing
     613             :   // streams). When this is true, the audio thread will never start again after
     614             :   // it has stopped.
     615             :   bool mAudioCaptured;
     616             : 
     617             :   // True if all audio frames are already rendered.
     618             :   bool mAudioCompleted = false;
     619             : 
     620             :   // True if all video frames are already rendered.
     621             :   bool mVideoCompleted = false;
     622             : 
     623             :   // True if we should not decode/preroll unnecessary samples, unless we're
     624             :   // played. "Prerolling" in this context refers to when we decode and
     625             :   // buffer decoded samples in advance of when they're needed for playback.
     626             :   // This flag is set for preload=metadata media, and means we won't
     627             :   // decode more than the first video frame and first block of audio samples
     628             :   // for that media when we startup, or after a seek. When Play() is called,
     629             :   // we reset this flag, as we assume the user is playing the media, so
      630             :   // prerolling is appropriate then. This flag is used to reduce the memory
      631             :   // and CPU overhead of prerolling samples for media elements that may
      632             :   // never play.
     633             :   bool mMinimizePreroll;
     634             : 
     635             :   // Stores presentation info required for playback.
     636             :   Maybe<MediaInfo> mInfo;
     637             : 
     638             :   mozilla::MediaMetadataManager mMetadataManager;
     639             : 
      640             :   // True if we've decoded the first frames (and thus know the start time) and
      641             :   // notified the FirstFrameLoaded event. Note we can't initiate a seek until
      642             :   // the start time is known, which happens when the first frames are decoded or
      643             :   // we are playing an MSE stream (where the start time is always assumed to be 0).
     644             :   bool mSentFirstFrameLoadedEvent;
     645             : 
     646             :   // True if video decoding is suspended.
     647             :   bool mVideoDecodeSuspended;
     648             : 
     649             :   // True if the media is seekable (i.e. supports random access).
     650             :   bool mMediaSeekable = true;
     651             : 
     652             :   // True if the media is seekable only in buffered ranges.
     653             :   bool mMediaSeekableOnlyInBufferedRanges = false;
     654             : 
      655             :   // Timer used to delay entering video decode suspension.
     656             :   DelayedScheduler mVideoDecodeSuspendTimer;
     657             : 
     658             :   // Data about MediaStreams that are being fed by the decoder.
     659             :   const RefPtr<OutputStreamManager> mOutputStreamManager;
     660             : 
     661             :   // Media data resource from the decoder.
     662             :   RefPtr<MediaResource> mResource;
     663             : 
     664             :   // Track the current video decode mode.
     665             :   VideoDecodeMode mVideoDecodeMode;
     666             : 
      667             :   // Track completion and errors for audio/video separately.
     668             :   MozPromiseRequestHolder<GenericPromise> mMediaSinkAudioPromise;
     669             :   MozPromiseRequestHolder<GenericPromise> mMediaSinkVideoPromise;
     670             : 
     671             :   MediaEventListener mAudioQueueListener;
     672             :   MediaEventListener mVideoQueueListener;
     673             :   MediaEventListener mAudibleListener;
     674             :   MediaEventListener mOnMediaNotSeekable;
     675             : 
     676             :   MediaEventProducerExc<UniquePtr<MediaInfo>,
     677             :                         UniquePtr<MetadataTags>,
     678             :                         MediaDecoderEventVisibility> mMetadataLoadedEvent;
     679             :   MediaEventProducerExc<nsAutoPtr<MediaInfo>,
     680             :                         MediaDecoderEventVisibility> mFirstFrameLoadedEvent;
     681             : 
     682             :   MediaEventProducer<MediaEventType> mOnPlaybackEvent;
     683             :   MediaEventProducer<MediaResult> mOnPlaybackErrorEvent;
     684             : 
     685             :   MediaEventProducer<DecoderDoctorEvent> mOnDecoderDoctorEvent;
     686             : 
     687             :   void OnCDMProxyReady(RefPtr<CDMProxy> aProxy);
     688             :   void OnCDMProxyNotReady();
     689             :   RefPtr<CDMProxy> mCDMProxy;
     690             :   MozPromiseRequestHolder<MediaDecoder::CDMProxyPromise> mCDMProxyPromise;
     691             : 
     692             :   const bool mIsMSE;
     693             : 
     694             : private:
     695             :   // The buffered range. Mirrored from the decoder thread.
     696             :   Mirror<media::TimeIntervals> mBuffered;
     697             : 
     698             :   // The duration explicitly set by JS, mirrored from the main thread.
     699             :   Mirror<Maybe<double>> mExplicitDuration;
     700             : 
     701             :   // The current play state and next play state, mirrored from the main thread.
     702             :   Mirror<MediaDecoder::PlayState> mPlayState;
     703             :   Mirror<MediaDecoder::PlayState> mNextPlayState;
     704             : 
     705             :   // Volume of playback. 0.0 = muted. 1.0 = full volume.
     706             :   Mirror<double> mVolume;
     707             : 
     708             :   // Pitch preservation for the playback rate.
     709             :   Mirror<bool> mPreservesPitch;
     710             : 
     711             :   // Whether to seek back to the start of the media resource
     712             :   // upon reaching the end.
     713             :   Mirror<bool> mLooping;
     714             : 
     715             :   // True if the media is same-origin with the element. Data can only be
     716             :   // passed to MediaStreams when this is true.
     717             :   Mirror<bool> mSameOriginMedia;
     718             : 
     719             :   // An identifier for the principal of the media. Used to track when
     720             :   // main-thread induced principal changes get reflected on MSG thread.
     721             :   Mirror<PrincipalHandle> mMediaPrincipalHandle;
     722             : 
     723             :   // Estimate of the current playback rate (bytes/second).
     724             :   Mirror<double> mPlaybackBytesPerSecond;
     725             : 
     726             :   // True if mPlaybackBytesPerSecond is a reliable estimate.
     727             :   Mirror<bool> mPlaybackRateReliable;
     728             : 
     729             :   // Current decoding position in the stream.
     730             :   Mirror<int64_t> mDecoderPosition;
     731             : 
     732             : 
     733             :   // Duration of the media. This is guaranteed to be non-null after we finish
     734             :   // decoding the first frame.
     735             :   Canonical<media::NullableTimeUnit> mDuration;
     736             : 
     737             :   // The status of our next frame. Mirrored on the main thread and used to
     738             :   // compute ready state.
     739             :   Canonical<NextFrameStatus> mNextFrameStatus;
     740             : 
     741             :   // The time of the current frame, corresponding to the "current
     742             :   // playback position" in HTML5. This is referenced from 0, which is the initial
     743             :   // playback position.
     744             :   Canonical<media::TimeUnit> mCurrentPosition;
     745             : 
     746             :   // Current playback position in the stream in bytes.
     747             :   Canonical<int64_t> mPlaybackOffset;
     748             : 
     749             :   // Used to distinguish whether the audio is producing sound.
     750             :   Canonical<bool> mIsAudioDataAudible;
     751             : 
     752             : public:
     753             :   AbstractCanonical<media::TimeIntervals>* CanonicalBuffered() const;
     754             : 
     755           0 :   AbstractCanonical<media::NullableTimeUnit>* CanonicalDuration()
     756             :   {
     757           0 :     return &mDuration;
     758             :   }
     759           0 :   AbstractCanonical<NextFrameStatus>* CanonicalNextFrameStatus()
     760             :   {
     761           0 :     return &mNextFrameStatus;
     762             :   }
     763           0 :   AbstractCanonical<media::TimeUnit>* CanonicalCurrentPosition()
     764             :   {
     765           0 :     return &mCurrentPosition;
     766             :   }
     767           0 :   AbstractCanonical<int64_t>* CanonicalPlaybackOffset()
     768             :   {
     769           0 :     return &mPlaybackOffset;
     770             :   }
     771           0 :   AbstractCanonical<bool>* CanonicalIsAudioDataAudible()
     772             :   {
     773           0 :     return &mIsAudioDataAudible;
     774             :   }
     775             : 
     776             : #ifdef XP_WIN
     777             :   // Whether we've called timeBeginPeriod(1) to request high resolution
     778             :   // timers. We request high resolution timers when playback starts, and
     779             :   // turn them off when playback is paused. Enabling high resolution
     780             :   // timers can cause higher CPU usage and battery drain on Windows 7.
     781             :   bool mHiResTimersRequested = false;
     782             :   // Whether we should enable high resolution timers. This is initialized at
     783             :   // MDSM construction, and mirrors the value of media.hi-res-timers.enabled.
     784             :   const bool mShouldUseHiResTimers;
     785             : #endif
     786             : };
     787             : 
     788             : } // namespace mozilla
     789             : 
     790             : #endif

Generated by: LCOV version 1.13