
AudioContext.decodeAudioData()

This is the preferred method of creating an audio source for the Web Audio API from an audio track.
API Audio AudioContext decodeAudioData Method Reference Web Audio API
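
A minimal sketch of how decodeAudioData() is typically used with its promise-based form; the URL "track.mp3" is a placeholder, not taken from the page above:

    const audioCtx = new AudioContext();

    // Fetch an encoded audio file and decode it into an AudioBuffer.
    async function loadTrack(url) {
      const response = await fetch(url);
      const arrayBuffer = await response.arrayBuffer();
      // decodeAudioData() returns a Promise that resolves to an AudioBuffer.
      return await audioCtx.decodeAudioData(arrayBuffer);
    }

    loadTrack("track.mp3").then((buffer) => {
      const source = audioCtx.createBufferSource();
      source.buffer = buffer;
      source.connect(audioCtx.destination);
      source.start();
    });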

IIRFilterNode

The IIRFilterNode interface of the Web Audio API is an AudioNode processor which implements a general infinite impulse response (IIR) filter; this type of filter can be used to implement tone control devices and graphic equalizers as well. It lets the parameters of the filter response be specified, so that it can be tuned as needed.
API Audio IIRFilterNode Interface Reference Web Audio API
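
A rough sketch of constructing an IIRFilterNode; the one-pole low-pass coefficients below are chosen purely for illustration and are not taken from the page above:

    const audioCtx = new AudioContext();

    // One-pole low-pass: y[n] = 0.1 * x[n] + 0.9 * y[n-1]
    // (illustrative coefficients, not from the page above)
    const feedforward = [0.1];     // b (feedforward) coefficients
    const feedback = [1, -0.9];    // a (feedback) coefficients
    const iirFilter = new IIRFilterNode(audioCtx, { feedforward, feedback });

    const oscillator = new OscillatorNode(audioCtx, { type: "square", frequency: 440 });
    oscillator.connect(iirFilter).connect(audioCtx.destination);
    oscillator.start();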

Basic concepts behind Web Audio API

The Web Audio API involves handling audio operations inside an audio context, and has been designed to allow modular routing. Basic audio operations are performed with audio nodes, which are linked together to form an audio routing graph. Several sources — with different types of channel layout — are supported even within a single context. This modular design provides the flexibility to create complex audio functions with dynamic effects.
Audio concepts Guide Media Web Audio API
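
A minimal sketch of this modular routing, linking an oscillator source through a gain node to the context's destination:

    const audioCtx = new AudioContext();

    // Nodes are created from the context and linked into a routing graph:
    // oscillator (source) -> gain (effect) -> destination (output).
    const oscillator = audioCtx.createOscillator();
    const gainNode = audioCtx.createGain();

    oscillator.connect(gainNode);
    gainNode.connect(audioCtx.destination);

    gainNode.gain.value = 0.5;          // halve the volume
    oscillator.frequency.value = 220;   // set the pitch
    oscillator.start();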

AudioBufferSourceNode

AudioBufferSourceNode has no input and exactly one output. The number of channels in the output corresponds to the number of channels of the AudioBuffer that is set to the AudioBufferSourceNode.buffer property. If there is no buffer set—that is, if the attribute's value is null—the output contains one channel consisting of silence. An AudioBufferSourceNode can only be played once; that is, only one call to AudioBufferSourceNode.start() is allowed. If the sound needs to be played again, another AudioBufferSourceNode has to be created. These nodes are cheap to create, and AudioBuffers can be reused across plays. It is often said that AudioBufferSourceNodes have to be used in a "fire and forget" fashion: once a node has been started, all references to it can be dropped, and it will be garbage-collected automatically.
API Audio AudioBufferSourceNode Interface Reference Web Audio API
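
A sketch of the "fire and forget" pattern described above; decodedBuffer is assumed to be an AudioBuffer obtained elsewhere (for example from decodeAudioData()):

    const audioCtx = new AudioContext();

    // Play the same AudioBuffer as many times as needed by creating a fresh
    // AudioBufferSourceNode per play.
    function playOnce(decodedBuffer) {
      const source = audioCtx.createBufferSource();
      source.buffer = decodedBuffer;       // the buffer can be reused across plays
      source.connect(audioCtx.destination);
      source.start();                      // start() may only be called once per node
      // No reference to `source` is kept; the node is garbage-collected after it ends.
    }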

AudioBufferSourceNode.buffer

If the buffer property is set to null, the node outputs a single channel of silence.
API Audio AudioBufferSourceNode Buffer Property Reference Web Audio API

AudioContext.createMediaStreamSource()

For more details about media stream audio source nodes, check out the MediaStreamAudioSourceNode reference page.
API Audio AudioContext createMediaStreamSource Method Reference Web Audio API
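
A minimal sketch, assuming microphone access via getUserMedia(), of feeding a MediaStream into the audio graph:

    const audioCtx = new AudioContext();

    // Route microphone input into the graph via a MediaStreamAudioSourceNode.
    navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
      const micSource = audioCtx.createMediaStreamSource(stream);
      micSource.connect(audioCtx.destination);   // monitor the microphone directly
    });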

AudioContext.resume()

The resume() method of the AudioContext Interface resumes the progression of time in an audio context that has previously been suspended.
API Audio AudioContext Method Reference resume Web Audio API

AudioContext.suspend()

The suspend() method of the AudioContext Interface suspends the progression of time in the audio context, temporarily halting audio hardware access and reducing CPU/battery usage in the process — this is useful if you want an application to power down the audio hardware when it will not be using an audio context for a while.
API Audio AudioContext Method Reference suspend Web Audio API
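
A sketch combining suspend() and resume(); pauseButton is a placeholder element, not taken from the pages above:

    const audioCtx = new AudioContext();

    // Toggle playback: suspend() halts audio hardware access and the context's
    // progression of time; resume() picks it back up. Both return Promises.
    pauseButton.addEventListener("click", async () => {
      if (audioCtx.state === "running") {
        await audioCtx.suspend();
      } else if (audioCtx.state === "suspended") {
        await audioCtx.resume();
      }
    });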