
Real-Time Audio Analysis & Audio Visualization

To generate visuals driven by sound, you need to extract either the overall amplitude of the signal or the amplitude of a specific frequency region. For example, if you want to trigger a visual whenever a kick drum hits, you would only look at the bass frequencies of your sound.

Cables offers three operators for analyzing sound and extracting different kinds of information from it.

The first operator is the AudioAnalyzer. This op provides amplitude outputs and FFT-transformed array outputs. Have a look at this example patch for a simple overview of the controls, and refer to this patch for a showcase of its abilities. Its output also feeds the following two operators.
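
Under the hood, audio analysis in the browser is built on the Web Audio API's AnalyserNode. The following sketch is not the op's actual source, just an illustration of the concept: it reads an FFT array and a rough overall amplitude from a source once per frame (the oscillator source is only there to keep the sketch self-contained).

```javascript
// Minimal Web Audio sketch of the kind of analysis the AudioAnalyzer performs.
// Illustration of the concept only, not the op's actual code.
const audioCtx = new AudioContext();
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 1024;              // 512 frequency bins
analyser.smoothingTimeConstant = 0.8; // smooths values between frames

// any source works; an oscillator keeps the sketch self-contained
// (in practice this would be e.g. a MediaElementAudioSourceNode)
const source = audioCtx.createOscillator();
source.connect(analyser);
analyser.connect(audioCtx.destination);
source.start();

const fft  = new Uint8Array(analyser.frequencyBinCount); // FFT array, 0..255 per bin
const wave = new Uint8Array(analyser.fftSize);           // time-domain samples

function analyzeFrame() {
  analyser.getByteFrequencyData(fft);   // amplitude per frequency bin
  analyser.getByteTimeDomainData(wave); // raw waveform, useful for the overall level

  // rough overall amplitude: average deviation from the 128 midpoint, scaled to 0..1
  let sum = 0;
  for (let i = 0; i < wave.length; i++) sum += Math.abs(wave[i] - 128);
  const amplitude = sum / wave.length / 128;

  // `amplitude` and `fft` are the values you would use to drive visuals
  requestAnimationFrame(analyzeFrame);
}
analyzeFrame();
```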

The second op is the FFTAreaAverage op. It needs the FFT Array output of the AudioAnalyzer as its input. This op lets you specify a rectangle in the frequency spectrum of your sound: the width of the rectangle is the frequency range to analyze and the height is the amplitude range. Only that region of the spectrum contributes to the output value. Have a look at the example patch for further insights.
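
Conceptually, the op averages the FFT bins inside the chosen frequency range and remaps the result against the chosen amplitude range. A minimal sketch of that idea, assuming byte FFT data as in the sketch above (the function and parameter names are illustrative, not the op's actual ports):

```javascript
// Sketch of averaging a frequency/amplitude "rectangle" of the spectrum.
// Parameter names are illustrative, not the op's actual ports.
function fftAreaAverage(fft, sampleRate, fftSize, freqMin, freqMax, ampMin, ampMax) {
  const binWidth = sampleRate / fftSize;             // Hz covered by each FFT bin
  const first = Math.max(0, Math.floor(freqMin / binWidth));
  const last  = Math.min(fft.length - 1, Math.ceil(freqMax / binWidth));

  // average the bins inside the chosen frequency range (width of the rectangle)
  let sum = 0;
  for (let i = first; i <= last; i++) sum += fft[i];
  const avg = sum / (last - first + 1);              // 0..255 for byte FFT data

  // remap against the chosen amplitude range (height of the rectangle) to 0..1
  const t = (avg - ampMin) / (ampMax - ampMin);
  return Math.min(1, Math.max(0, t));
}

// e.g. react only to bass (kick drum) energy between 40 Hz and 120 Hz:
// const kick = fftAreaAverage(fft, audioCtx.sampleRate, analyser.fftSize, 40, 120, 100, 220);
```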

The third op is the AnalyzerTexture op. It also takes the FFT Array output of the AudioAnalyzer and creates a black-and-white texture from it. This is useful for texture masks in post-processing or for altering the look of objects through their materials in response to sound. Please refer to the example patch; you could, for instance, feed the texture output into the opacity texture of a BasicMaterial.
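
One way to picture this: every FFT bin becomes one grayscale pixel. Below is a rough canvas-based sketch of that idea, not the op's implementation (which works with WebGL textures directly):

```javascript
// Rough sketch: turn an FFT byte array into a 1-pixel-high grayscale image
// that could serve as a texture or mask. Not the AnalyzerTexture op's code.
function fftToCanvas(fft) {
  const canvas = document.createElement('canvas');
  canvas.width = fft.length;     // one pixel per frequency bin
  canvas.height = 1;
  const ctx = canvas.getContext('2d');
  const image = ctx.createImageData(fft.length, 1);

  for (let i = 0; i < fft.length; i++) {
    const v = fft[i];                    // 0..255 amplitude of this bin
    image.data[i * 4 + 0] = v;           // R
    image.data[i * 4 + 1] = v;           // G
    image.data[i * 4 + 2] = v;           // B  -> gray value
    image.data[i * 4 + 3] = 255;         // A
  }
  ctx.putImageData(image, 0, 0);
  return canvas;   // can be uploaded as a WebGL texture or used as a mask
}
```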

With these three ops you can drive anything that should respond to sound. You can also apply audio effects to the signal before it is analyzed, without routing those effects to the speakers: the analysis then works on the processed copy while the original sound plays back unchanged. This opens up a whole world of new visualization possibilities.
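
In Web Audio terms this means splitting the signal: the dry path goes to the speakers while a processed copy feeds only the analysis. A standalone routing sketch of that idea, with a made-up file name and filter settings chosen purely for illustration:

```javascript
// Routing sketch: analyze a filtered copy of the signal while the
// unprocessed audio is what you actually hear. Names are illustrative.
const ctx = new AudioContext();
const media = new Audio('music.mp3');          // hypothetical audio file
const input = ctx.createMediaElementSource(media);

const analyser = ctx.createAnalyser();
const lowpass = ctx.createBiquadFilter();
lowpass.type = 'lowpass';
lowpass.frequency.value = 150;                 // isolate bass energy for the analysis

input.connect(ctx.destination);                // audible path: dry signal to the speakers
input.connect(lowpass);                        // analysis path: filtered copy...
lowpass.connect(analyser);                     // ...ends at the analyser, never heard

media.play();
```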

A patch to make a circle change color & opacity could look like this:

[Image: rt_vis_example patch]
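
For comparison, the same idea expressed outside of cables: a hypothetical JavaScript function that drives the color and opacity of a 2D-canvas circle from an analysis value `level` between 0 and 1 (for example the result of the averaging sketch above).

```javascript
// Hypothetical sketch: drive a circle's color & opacity from an analysis
// value `level` in the 0..1 range (e.g. the fftAreaAverage result above).
function drawCircle(ctx2d, level) {
  ctx2d.clearRect(0, 0, ctx2d.canvas.width, ctx2d.canvas.height);
  ctx2d.globalAlpha = level;                                      // opacity follows the sound
  ctx2d.fillStyle = `hsl(${Math.round(level * 360)}, 80%, 50%)`;  // hue follows the sound
  ctx2d.beginPath();
  ctx2d.arc(ctx2d.canvas.width / 2, ctx2d.canvas.height / 2, 100, 0, Math.PI * 2);
  ctx2d.fill();
}
```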

