javascript - Precomputed Web Audio API Time Domain and Spectrogram Visualization
I'm synthesizing sound through the Web Audio API using various oscillators/filters, and I have time domain and spectrogram visualizations that run in real time as the oscillators play (similar to here and here).
However, I want to be able to create an initial, precomputed visualization of the sound network run for a set amount of time, so the user can see what the network sounds like before playing it. Is this possible, or is there a way to speed up the time it takes to generate the visualizations?
Use an OfflineAudioContext. It gives you a PCM buffer back, asynchronously. You can then compute windowed RMS values from it (or use the raw time domain data, depending on what you want to do) and draw that on a canvas or whatever.
The OfflineAudioContext lets you run the graph as fast as the machine can, and it is a drop-in replacement for the AudioContext, apart from three nodes that can't be used (MediaStreamAudioDestinationNode, MediaStreamAudioSourceNode, and MediaElementAudioSourceNode), because MediaStreams are real-time objects: they make no sense when you are not rendering in real time.
It goes like this:
var off = new OfflineAudioContext(2 /* stereo */, 44100 /* length in frames */, 44100 /* samplerate */);

/* Do your thing: set up the graph as usual */
off.createOscillator();
...
...

/* This gets called when the graph has rendered one second of audio,
   here: 44100 frames at 44100 frames per second */
off.oncomplete = function(e) {
  e.renderedBuffer; /* an AudioBuffer that contains the PCM data */
};

/* Kick off the rendering */
off.startRendering();
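As a rough sketch of the windowed RMS idea mentioned above (the window size of 1024 frames and the rmsValues array are just illustrative choices, not anything mandated by the API), you could post-process the rendered buffer like this:

off.oncomplete = function(e) {
  /* Samples for channel 0, as a Float32Array */
  var data = e.renderedBuffer.getChannelData(0);
  var windowSize = 1024; /* frames per RMS window, pick to taste */
  var rmsValues = [];
  for (var i = 0; i < data.length; i += windowSize) {
    var end = Math.min(i + windowSize, data.length);
    var sum = 0;
    for (var j = i; j < end; j++) {
      sum += data[j] * data[j];
    }
    rmsValues.push(Math.sqrt(sum / (end - i)));
  }
  /* rmsValues can now be drawn to a canvas as the precomputed visualization */
};

In newer browsers, startRendering() also returns a promise that resolves with the rendered AudioBuffer, so you can write off.startRendering().then(function(buffer) { ... }) instead of using the oncomplete handler.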