OfflineAudioContext
The OfflineAudioContext interface is an AudioContext interface representing an audio-processing graph built from linked-together AudioNodes. In contrast with a standard AudioContext, an OfflineAudioContext doesn't render the audio to the device hardware; instead, it generates it, as fast as it can, and outputs the result to an AudioBuffer.
It is important to note that, whereas you can create a new AudioContext using the new AudioContext() constructor with no arguments, the new OfflineAudioContext() constructor requires three arguments:

```js
new OfflineAudioContext(numOfChannels, length, sampleRate);
```

This works in exactly the same way as when you create a new AudioBuffer with the AudioContext.createBuffer() method. For more detail, read Audio buffers: frames, samples and channels from our Basic concepts guide. The arguments are:
- numOfChannels: An integer representing the number of channels this buffer should have. Implementations must support a minimum of 32 channels.
- length: An integer representing the size of the buffer in sample-frames.
- sampleRate: The sample rate of the linear audio data in sample-frames per second. An implementation must support sample rates in at least the range 22050 to 96000, with 44100 being the most commonly used.
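For example, the following call (a minimal sketch, mirroring the example later on this page) creates a stereo offline context big enough to hold 40 seconds of audio at 44.1 kHz:

```js
// 2 channels, 44100 sample-frames/second × 40 seconds = 1,764,000 sample-frames
var offlineCtx = new OfflineAudioContext(2, 44100 * 40, 44100);
```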
Note: Like a regular AudioContext, an OfflineAudioContext can be the target of events; it therefore implements the EventTarget interface.
Properties
Implements properties from its parent, AudioContext.
- OfflineAudioContext.length (Read only): An integer representing the size of the buffer in sample-frames.
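As an illustration (a hypothetical check, not taken from the original page), the property simply reflects the length passed to the constructor:

```js
var offlineCtx = new OfflineAudioContext(2, 44100 * 40, 44100);
console.log(offlineCtx.length); // 1764000
```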
Event handlers
- OfflineAudioContext.oncomplete: An EventHandler called when processing is terminated, that is, when the complete event (of type OfflineAudioCompletionEvent) is raised.
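To sketch the event-based pattern (a minimal example assuming a one-oscillator graph; it is not part of the original page), you assign an oncomplete handler, start rendering, and read the result from the event's renderedBuffer property:

```js
var offlineCtx = new OfflineAudioContext(1, 44100 * 2, 44100);

// A trivial graph, just so there is something to render
var osc = offlineCtx.createOscillator();
osc.connect(offlineCtx.destination);
osc.start();

offlineCtx.oncomplete = function(event) {
  // event is an OfflineAudioCompletionEvent
  console.log('Rendered ' + event.renderedBuffer.length + ' sample-frames');
};

offlineCtx.startRendering();
```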
Methods
Also implements methods from its parent, AudioContext, and from EventTarget.
- OfflineAudioContext.resume(): Resumes the progression of time in an audio context that has been suspended.
- OfflineAudioContext.suspend(): Schedules a suspension of the time progression in the audio context at the specified time and returns a promise.
- OfflineAudioContext.startRendering(): Starts rendering the audio, taking into account the current connections and the current scheduled changes. This is the event-based version.
- OfflineAudioContext.startRendering() (promise): Starts rendering the audio, taking into account the current connections and the current scheduled changes. This is the newer promise-based version.
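To illustrate suspend() and resume() (a minimal sketch under the same one-oscillator assumption as above, not from the original page), you can schedule a suspension partway through the rendered timeline, alter the graph while time is frozen, then resume:

```js
var offlineCtx = new OfflineAudioContext(1, 44100 * 2, 44100);

var osc = offlineCtx.createOscillator();
osc.connect(offlineCtx.destination);
osc.start();

// Freeze rendering one second into the timeline, retune, then carry on
offlineCtx.suspend(1).then(function() {
  osc.frequency.value = 880;
  return offlineCtx.resume();
});

offlineCtx.startRendering().then(function(renderedBuffer) {
  console.log('Rendered ' + renderedBuffer.length + ' sample-frames');
});
```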
Example
In this simple example, we declare both an AudioContext and an OfflineAudioContext object. We use the AudioContext to load an audio track via XHR (AudioContext.decodeAudioData()), then the OfflineAudioContext to render the audio into an AudioBufferSourceNode and play the track through. After the offline audio graph is set up, you need to render it to an AudioBuffer using OfflineAudioContext.startRendering().

When the startRendering() promise resolves, rendering has completed and the output AudioBuffer is returned out of the promise.

At this point we create another audio context, create an AudioBufferSourceNode inside it, and set its buffer to be equal to the promise's AudioBuffer. This is then played as part of a simple standard audio graph.
Note: For a working example, see our offline-audio-context-promise GitHub repo (see the source code too).
```js
// define online and offline audio context

var audioCtx = new AudioContext();
var offlineCtx = new OfflineAudioContext(2, 44100 * 40, 44100);
var source = offlineCtx.createBufferSource();

// use XHR to load an audio track, decodeAudioData to decode it,
// and OfflineAudioContext to render it

function getData() {
  var request = new XMLHttpRequest();
  request.open('GET', 'viper.ogg', true);
  request.responseType = 'arraybuffer';

  request.onload = function() {
    var audioData = request.response;

    audioCtx.decodeAudioData(audioData, function(buffer) {
      source.buffer = buffer;
      source.connect(offlineCtx.destination);
      source.start();
      //source.loop = true;

      offlineCtx.startRendering().then(function(renderedBuffer) {
        console.log('Rendering completed successfully');
        var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
        var song = audioCtx.createBufferSource();
        song.buffer = renderedBuffer;
        song.connect(audioCtx.destination);

        // 'play' is a button element defined in the page's HTML
        play.onclick = function() {
          song.start();
        };
      }).catch(function(err) {
        console.log('Rendering failed: ' + err);
        // Note: The promise should reject when startRendering is called
        // a second time on an OfflineAudioContext
      });
    });
  };

  request.send();
}

// Run getData to start the process off
getData();
```
Specifications
Specification | Status | Comment
---|---|---
Web Audio API: The definition of 'OfflineAudioContext' in that specification | Working Draft | Initial definition
Browser compatibility
Feature | Chrome | Firefox (Gecko) | Internet Explorer | Opera | Safari (WebKit)
---|---|---|---|---|---
Basic support | 10.0 webkit | 25.0 (25.0) | No support | 15.0 webkit, 22 (unprefixed) | 6.0 webkit
Promise-based startRendering() | 42.0 | 37.0 (37.0) | ? | ? | ?
suspend(), resume() | 49.0 | ? | ? | ? | ?
length | 51.0 | ? | ? | ? | ?
Feature | Android Webview | Firefox Mobile (Gecko) | Firefox OS | IE Mobile | Opera Mobile | Safari Mobile | Chrome for Android
---|---|---|---|---|---|---|---
Basic support | 33.0 | 26.0 | 1.2 | ? | ? | ? | (Yes)
Promise-based startRendering() | 42.0 | 37.0 | 2.2 | ? | ? | ? | 42.0
suspend(), resume() | 49.0 | ? | ? | ? | ? | ? | 49.0
length | 51.0 | ? | ? | ? | ? | ? | 51.0
See also
License
© 2016 Mozilla Contributors
Licensed under the Creative Commons Attribution-ShareAlike License v2.5 or later.
https://developer.mozilla.org/en-us/docs/web/api/offlineaudiocontext