Getting a real-time live audio stream from a Node.js server to clients

Date: 2023-01-19 18:58:21

I need a real-time live audio stream from one client, through a server, to multiple listener clients.

Currently I have the recording on the client working, and I stream the audio to the server through socket.io. The server receives this data and must stream the audio (also through socket.io?) to the clients that want to listen. It must be as close to real time as possible (minimal delay).

I'm using getUserMedia to record the microphone (browser compatibility is not important here). I want the clients to use the HTML5 audio tag to listen to the stream. The data received on the server are chunks (currently batched in groups of 700) packed into a Blob of type audio/wav.

This is my code to send it to the server:

mediaRecorder.chunks = []; // initialize the buffer; it was never created in the original
mediaRecorder.ondataavailable = function(e) {
    this.chunks.push(e.data);
    if (this.chunks.length >= 700)
    {
        this.sendData(this.chunks);
        this.chunks = [];
    }
};
mediaRecorder.sendData = function(buffer) {
    var blob = new Blob(buffer, { 'type' : 'audio/wav' }); // 'var' avoids an implicit global
    socket.emit('voice', blob);
};
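A lower-latency variant of the capture side (a sketch, not the author's code; the 250 ms timeslice is an arbitrary assumption) would be to let MediaRecorder hand over chunks periodically via the timeslice argument of `start()`, instead of batching 700 of them before sending:

```javascript
// Sketch: capture the microphone and forward each recorded chunk as it
// arrives. Assumptions: `socket` is a connected socket.io client, and
// the browser supports MediaRecorder (compatibility is not a concern).
async function startCapture(socket) {
    const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
    const recorder = new MediaRecorder(stream);
    recorder.ondataavailable = function(e) {
        // Forward each chunk immediately instead of accumulating a batch
        if (e.data.size > 0) socket.emit('voice', e.data);
    };
    recorder.start(250); // fire ondataavailable roughly every 250 ms
    return recorder;
}
```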

On the server I'm able to send the chunks to the client the same way like this:

socket.on('voice', function(blob) {
    socket.broadcast.emit('voice', blob);
});
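For completeness, the server-side relay can be pulled into a small wiring function (a sketch; it assumes `io` is a socket.io server instance, which matches the stack described above):

```javascript
// Sketch: rebroadcast every 'voice' payload to all connected clients
// except the sender. Assumption: `io` is a socket.io server instance.
function wireVoiceRelay(io) {
    io.on('connection', function(socket) {
        socket.on('voice', function(blob) {
            // socket.broadcast sends to everyone but this socket
            socket.broadcast.emit('voice', blob);
        });
    });
}
```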

On the client I can play it like this:

var audio = document.createElement('audio');
socket.on('voice', function(arrayBuffer) {
    var blob = new Blob([arrayBuffer], { 'type' : 'audio/wav' });
    audio.src = window.URL.createObjectURL(blob);
    audio.play();
});

This works for the first blob of chunks I send, but you're not allowed to keep changing audio.src to a new URL source, so this is not a working solution.

I think I have to create some kind of stream on the server which I can put into the HTML5 audio tag on the listening clients, but I don't know how. The received blobs of chunks should then be appended to this stream in real time.
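One way to get this "appendable stream" behavior on the client side is the Media Source Extensions API (a sketch under assumptions: MediaSource does not accept 'audio/wav', so the recorder would have to produce a supported container such as 'audio/webm; codecs=opus', and the server would relay the chunks as ArrayBuffers):

```javascript
// Sketch: feed incoming chunks into a single <audio> element via
// MediaSource instead of replacing its src. Assumptions: `socket` is
// a socket.io client, `audio` is an <audio> element, and the chunks
// arrive as ArrayBuffers in a MediaSource-supported container.
function playLiveStream(socket, audio) {
    const mediaSource = new MediaSource();
    audio.src = URL.createObjectURL(mediaSource);
    mediaSource.addEventListener('sourceopen', function() {
        const sourceBuffer = mediaSource.addSourceBuffer('audio/webm; codecs=opus');
        const queue = [];
        // A SourceBuffer accepts one append at a time; queue the rest
        sourceBuffer.addEventListener('updateend', function() {
            if (queue.length > 0) sourceBuffer.appendBuffer(queue.shift());
        });
        socket.on('voice', function(arrayBuffer) {
            if (sourceBuffer.updating || queue.length > 0) {
                queue.push(arrayBuffer);
            } else {
                sourceBuffer.appendBuffer(arrayBuffer);
            }
        });
    });
    audio.play();
}
```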

What is the best approach to do this? And is my approach from the client microphone to the server correct?

2 solutions

#1



I'm a bit late to the party here, but it looks like the Web Audio API will be your friend, if you haven't already finished this. It allows you to play an audio stream directly to the output device without attaching it to an audio element.

I'm looking at doing the same thing, and your question has answered mine: how to get data from client to server. The benefit of the Web Audio API is the ability to add streams together and apply audio effects to them on the server.

MDN Web Audio API

The socket.io events should replace the data in an AudioBuffer object in the audio context. Audio processing can happen in a Node.js web-audio context before the result is emitted as a single stream to each client.
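On the listening client, this answer's idea might look roughly like the following (a sketch under a strong assumption: each relayed chunk arrives as an ArrayBuffer that decodeAudioData can decode on its own, which in practice requires each chunk to carry valid container headers). Chunks are decoded and scheduled back to back on the AudioContext clock instead of going through an audio tag:

```javascript
// Sketch: decode each incoming chunk and schedule it seamlessly after
// the previous one. Assumptions: `socket` is a socket.io client and
// each chunk is independently decodable.
function playViaWebAudio(socket) {
    const ctx = new AudioContext();
    let playTime = 0; // when the next chunk should start, in context time
    socket.on('voice', function(arrayBuffer) {
        ctx.decodeAudioData(arrayBuffer.slice(0), function(audioBuffer) {
            const node = ctx.createBufferSource();
            node.buffer = audioBuffer;
            node.connect(ctx.destination);
            // Never schedule in the past; otherwise butt chunks together
            playTime = Math.max(playTime, ctx.currentTime);
            node.start(playTime);
            playTime += audioBuffer.duration;
        });
    });
}
```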

#2



You could change the audio src dynamically as follows (assuming MP3):

<audio id="audio" controls="controls">
    <source id="mp3Source" type="audio/mp3"></source>
    Your browser does not support the audio format.
</audio>

Call the following function whenever a socket event is received:

function updateSource() {
    var audio = document.getElementById('audio');
    var source = document.getElementById('mp3Source');
    source.src = <blob>;

    audio.load(); // call this to just preload the audio without playing
    audio.play(); // call this to play the song right away
}
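One practical caveat with this approach: if each socket event creates a new object URL, the previous ones should be revoked, or they leak memory over a long session. A small sketch of that bookkeeping (the `makeSourceUpdater` name and the stubs are illustrative, not part of the original answer):

```javascript
// Sketch: swap the <source> src per received Blob while revoking the
// previous object URL. Assumptions: `audio` is an <audio> element and
// `source` is its <source> child, as in the markup above.
function makeSourceUpdater(audio, source) {
    var previousUrl = null;
    return function update(blob) {
        if (previousUrl) URL.revokeObjectURL(previousUrl);
        previousUrl = URL.createObjectURL(blob);
        source.src = previousUrl;
        audio.load(); // preload the new data
        audio.play(); // resume playback right away
    };
}
```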
