Here is my use case: Alice has a cool new media track that she wants Bob to listen to. She selects the media file in her browser and the media file starts playing instantly in Bob's browser.
I'm not even sure if this is possible to build using the WebRTC API right now. All examples I can find use streams obtained via getUserMedia(), but this is what I have:
```javascript
var context = new AudioContext();
var pc = new RTCPeerConnection(pc_config);

function handleFileSelect(event) {
  var file = event.target.files[0];
  if (file) {
    if (file.type.match('audio*')) {
      console.log(file.name);
      var reader = new FileReader();
      reader.onload = (function(readEvent) {
        context.decodeAudioData(readEvent.target.result, function(buffer) {
          var source = context.createBufferSource();
          var destination = context.createMediaStreamDestination();
          source.buffer = buffer;
          source.start(0);
          source.connect(destination);
          pc.addStream(destination.stream);
          pc.createOffer(setLocalAndSendMessage);
        });
      });
      reader.readAsArrayBuffer(file);
    }
  }
}
```

On the receiving side I have the following:
```javascript
function gotRemoteStream(event) {
  var mediaStreamSource = context.createMediaStreamSource(event.stream);
  mediaStreamSource.connect(context.destination);
}
```

This code does not make the media (music) play on the receiving side. I do, however, receive an ended event right after the WebRTC handshake is done and the gotRemoteStream function was called. The gotRemoteStream function gets called, but the media does not start playing.
On Alice's side the magic is supposed to happen in the line that says source.connect(destination). When I replace that line with source.connect(context.destination), the media starts playing correctly through Alice's speakers.
On Bob's side a media stream source is created based upon Alice's stream. However, when the local speakers are connected using mediaStreamSource.connect(context.destination), the music doesn't start playing through the speakers.
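One thing worth trying on the receiving side is to bypass the Web Audio graph entirely and hand the remote stream straight to an audio element; interop between remote WebRTC streams and createMediaStreamSource has historically been unreliable in Chrome. A minimal sketch, assuming an audio element exists on the page (attachRemoteStream is a hypothetical helper of mine, not part of any API):

```javascript
// Hypothetical helper: play a remote MediaStream through an <audio> element
// instead of routing it through the Web Audio graph.
function attachRemoteStream(stream, audioEl) {
  if ('srcObject' in audioEl) {
    audioEl.srcObject = stream;                 // current browsers
  } else {
    audioEl.src = URL.createObjectURL(stream);  // legacy fallback
  }
  return audioEl.play();
}

// Usage inside the existing handler:
// function gotRemoteStream(event) {
//   attachRemoteStream(event.stream, document.querySelector('audio'));
// }
```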
Of course I could always send the media file through a DataChannel, but where is the fun in that...
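For reference, the DataChannel route would look roughly like this: slice the file's ArrayBuffer into messages small enough for the channel and send them in order. A sketch under my own assumptions (the 16 KiB chunk size and the 'EOF' end marker are conventions I made up, not part of WebRTC):

```javascript
// Hypothetical sketch: split a file's ArrayBuffer into DataChannel-sized
// messages and send them in order, followed by an end-of-file marker.
var CHUNK_SIZE = 16 * 1024; // 16 KiB is a commonly safe message size

function chunkBuffer(buffer, chunkSize) {
  var chunks = [];
  for (var offset = 0; offset < buffer.byteLength; offset += chunkSize) {
    chunks.push(buffer.slice(offset, offset + chunkSize));
  }
  return chunks;
}

function sendFileOverChannel(channel, arrayBuffer) {
  chunkBuffer(arrayBuffer, CHUNK_SIZE).forEach(function (chunk) {
    channel.send(chunk);
  });
  channel.send('EOF'); // hypothetical marker so the receiver knows it's done
}
```

The receiver would collect the chunks, reassemble them, and decode or play the result locally.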
Any clues on what is wrong with my code or some ideas on how to achieve my use case would be greatly appreciated!
I'm using the latest and greatest Chrome Canary.
Thanks.
Due to an error in my code, the received stream on Bob's side was ended because the SDP answer on Alice's side was not handled correctly. After fixing the issue the media still does not play, but the example behaves differently. I updated the question accordingly. –
Eelco Jul 10 '13 at 11:21
It might be unrelated (I have no experience with WebRTC), but could github.com/wearefractal/holla help you?
Reading a video file and playing it via MediaSource
```javascript
var video = document.querySelector('video');
var assetURL = 'frag_bunny.mp4';
// Need to be specific for Blink regarding codecs
// ./mp4info frag_bunny.mp4 | grep Codec
var mimeCodec = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';

if ('MediaSource' in window && MediaSource.isTypeSupported(mimeCodec)) {
  var mediaSource = new MediaSource();
  // console.log(mediaSource.readyState); // closed
  video.src = URL.createObjectURL(mediaSource);
  mediaSource.addEventListener('sourceopen', sourceOpen);
} else {
  console.error('Unsupported MIME type or codec: ', mimeCodec);
}

function sourceOpen(_) {
  // console.log(this.readyState); // open
  var mediaSource = this;
  var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
  fetchAB(assetURL, function (buf) {
    sourceBuffer.addEventListener('updateend', function (_) {
      mediaSource.endOfStream();
      video.play();
      // console.log(mediaSource.readyState); // ended
    });
    sourceBuffer.appendBuffer(buf);
  });
}

function fetchAB(url, cb) {
  console.log(url);
  var xhr = new XMLHttpRequest();
  xhr.open('get', url);
  xhr.responseType = 'arraybuffer';
  xhr.onload = function () {
    cb(xhr.response);
  };
  xhr.send();
}
```
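To tie this back to the original use case: if the file's bytes arrive in pieces (for example over a DataChannel) rather than from one XHR, they can be reassembled into a single ArrayBuffer and appended to the SourceBuffer the same way. A hypothetical reassembly helper, not part of the Media Source API:

```javascript
// Hypothetical helper: concatenate received ArrayBuffer chunks back into
// one ArrayBuffer suitable for SourceBuffer.appendBuffer().
function concatChunks(chunks) {
  var total = chunks.reduce(function (sum, c) { return sum + c.byteLength; }, 0);
  var out = new Uint8Array(total);
  var offset = 0;
  chunks.forEach(function (c) {
    out.set(new Uint8Array(c), offset);
    offset += c.byteLength;
  });
  return out.buffer;
}
```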