OfflineAudioContext.startRendering()
The startRendering() method of the OfflineAudioContext interface starts rendering the audio graph, taking into account the current connections and the currently scheduled changes.
The complete event (of type OfflineAudioCompletionEvent) is raised when the rendering is finished, containing the resulting AudioBuffer in its renderedBuffer property.
Note: This version is to be superseded by the promise-based version, but currently both mechanisms are provided for legacy reasons.
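For reference, a minimal sketch of the promise-based form (in browsers that support it); the promise resolves with the rendered AudioBuffer directly, so no complete handler is needed:

offlineAudioCtx.startRendering().then(function(renderedBuffer) {
  // renderedBuffer is the rendered AudioBuffer
});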
Syntax
offlineAudioCtx.startRendering();
offlineAudioCtx.oncomplete = function(e) {
  // e.renderedBuffer contains the output buffer
};
Parameters
None.
Returns
Void.
Example
In this simple example, we declare both an AudioContext and an OfflineAudioContext object. We use the AudioContext to load an audio track via XHR and decode it (AudioContext.decodeAudioData), then the OfflineAudioContext to play the decoded audio through an AudioBufferSourceNode. After the offline audio graph is set up, you render it to an AudioBuffer using OfflineAudioContext.startRendering().
When the rendering has completed, the output AudioBuffer is made available in the renderedBuffer property of the OfflineAudioCompletionEvent (available in the OfflineAudioContext.oncomplete handler when the complete event fires).
At this point we create another audio context, create an AudioBufferSourceNode
inside it, and set its buffer to be equal to the renderedBuffer
property. This is then played as part of a simple standard audio graph.
Note: For a working example, see our offline-audio-context GitHub repo (see the source code too).
// define online and offline audio context
var audioCtx = new AudioContext();
var offlineCtx = new OfflineAudioContext(2, 44100 * 40, 44100);
var source = offlineCtx.createBufferSource();

// define variables
var pre = document.querySelector('pre');
var myScript = document.querySelector('script');
var play = document.querySelector('.play');
var stop = document.querySelector('.stop');

// use XHR to load an audio track, and
// decodeAudioData to decode it and stick it in a buffer.
// Then we put the buffer into the source
function getData() {
  var request = new XMLHttpRequest();
  request.open('GET', 'viper.ogg', true);
  request.responseType = 'arraybuffer';

  request.onload = function() {
    var audioData = request.response;

    audioCtx.decodeAudioData(audioData, function(buffer) {
      source.buffer = buffer;
      source.connect(offlineCtx.destination);
      source.start();
      //source.loop = true;
      offlineCtx.startRendering();
    },
    function(e) {
      console.log('Error with decoding audio data: ' + e.err);
    });
  };

  request.send();
}

// load the track, then render the offline graph;
// the result is handled in the oncomplete handler below
getData();

offlineCtx.oncomplete = function(e) {
  var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
  var song = audioCtx.createBufferSource();
  song.buffer = e.renderedBuffer;
  song.connect(audioCtx.destination);

  // wire up buttons to play and stop the rendered audio
  play.onclick = function() {
    song.start();
  };
  stop.onclick = function() {
    song.stop();
  };

  console.log('completed!');
};
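For comparison, here is a compact sketch of the same pipeline using the promise-based forms of decodeAudioData and startRendering. It assumes the same viper.ogg asset and .play button as above, plus browser support for fetch and the promise-based APIs:

var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
var offlineCtx = new OfflineAudioContext(2, 44100 * 40, 44100);

fetch('viper.ogg')
  .then(function(response) { return response.arrayBuffer(); })
  .then(function(arrayBuffer) { return audioCtx.decodeAudioData(arrayBuffer); })
  .then(function(decodedBuffer) {
    var source = offlineCtx.createBufferSource();
    source.buffer = decodedBuffer;
    source.connect(offlineCtx.destination);
    source.start();
    // the promise resolves with the rendered AudioBuffer,
    // so no oncomplete handler is needed
    return offlineCtx.startRendering();
  })
  .then(function(renderedBuffer) {
    var song = audioCtx.createBufferSource();
    song.buffer = renderedBuffer;
    song.connect(audioCtx.destination);
    document.querySelector('.play').onclick = function() {
      song.start();
    };
  });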
Specifications
Specification | Status | Comment
---|---|---
Web Audio API, the definition of 'startRendering()' in that specification | Working Draft |
Browser compatibility
Feature | Chrome | Firefox (Gecko) | Internet Explorer | Opera | Safari (WebKit)
---|---|---|---|---|---
Basic support | 10.0webkit | 25.0 (25.0) | No support | 15.0webkit, 22 (unprefixed) | 6.0webkit
Feature | Android | Android Webview | Firefox Mobile (Gecko) | Firefox OS | IE Mobile | Opera Mobile | Safari Mobile | Chrome for Android |
---|---|---|---|---|---|---|---|---|
Basic support | ? | (Yes) | 26.0 | 1.2 | ? | ? | ? | 33.0 |
License
© 2016 Mozilla Contributors
Licensed under the Creative Commons Attribution-ShareAlike License v2.5 or later.
https://developer.mozilla.org/en-us/docs/web/api/offlineaudiocontext/startrendering