Create VideoReceiver with external VCMTiming object.

In order for the VCMTiming object to be correctly updated with decoding timings
when running the WebRTC-NewVideoJitterBuffer experiment, the VCMTiming object
has to be available to both the VideoReceiver and the video_coding::FrameBuffer
classes. Therefore the VCMTiming object is created in VideoReceiveStream and
then passed to VideoReceiver/video_coding::FrameBuffer as they are constructed.
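
For context, a minimal sketch of the resulting ownership pattern, using
simplified stand-in classes (the real VideoReceiveStream, VideoReceiver, and
FrameBuffer constructors take more parameters; this only illustrates how one
VCMTiming instance is shared by pointer):

  #include <memory>

  // Simplified stand-ins; the real classes live in webrtc/modules/video_coding.
  class VCMTiming {};

  class VideoReceiver {
   public:
    explicit VideoReceiver(VCMTiming* timing) : _timing(timing) {}
   private:
    VCMTiming* _timing;  // Not owned; updated with decode timings.
  };

  namespace video_coding {
  class FrameBuffer {
   public:
    explicit FrameBuffer(VCMTiming* timing) : timing_(timing) {}
   private:
    VCMTiming* timing_;  // Not owned; same instance as in VideoReceiver.
  };
  }  // namespace video_coding

  // VideoReceiveStream owns the VCMTiming and hands the same raw pointer to
  // both consumers, so decode timings recorded by one are visible to the other.
  class VideoReceiveStream {
   public:
    VideoReceiveStream()
        : timing_(new VCMTiming()),
          video_receiver_(timing_.get()),
          frame_buffer_(timing_.get()) {}

   private:
    std::unique_ptr<VCMTiming> timing_;  // Outlives both members below.
    VideoReceiver video_receiver_;
    video_coding::FrameBuffer frame_buffer_;
  };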

BUG=webrtc:5514

Review-Url: https://codereview.webrtc.org/2575473004
Cr-Commit-Position: refs/heads/master@{#15638}
diff --git a/webrtc/modules/video_coding/video_coding_impl.h b/webrtc/modules/video_coding/video_coding_impl.h
index b5d04c4..e4fff1e 100644
--- a/webrtc/modules/video_coding/video_coding_impl.h
+++ b/webrtc/modules/video_coding/video_coding_impl.h
@@ -148,6 +148,7 @@
   VideoReceiver(Clock* clock,
                 EventFactory* event_factory,
                 EncodedImageCallback* pre_decode_image_callback,
+                VCMTiming* timing,
                 NackSender* nack_sender = nullptr,
                 KeyFrameRequestSender* keyframe_request_sender = nullptr);
   ~VideoReceiver();
@@ -208,7 +209,7 @@
   Clock* const clock_;
   rtc::CriticalSection process_crit_;
   rtc::CriticalSection receive_crit_;
-  VCMTiming _timing;
+  VCMTiming* _timing;
   VCMReceiver _receiver;
   VCMDecodedFrameCallback _decodedFrameCallback;
   VCMFrameTypeCallback* _frameTypeCallback GUARDED_BY(process_crit_);