Welcome to my quiet little corner of the internet. Today, I’ll be going over building an audio visualizer.
In a time before the modern web, before responsiveness, rounded borders, and sleek shadows, there were loads of wonky, crazy-looking, and outright broken websites. But they always had really cool parts that matured, got refined, and became the web as we know it now.
An audio visualizer is a cool widget-like feature that displays the playing audio's data in a pleasing graphical form. Here's what we're building:
You can view it here: https://undertale.humaidkhan.com/
To start with, I'll list the HTML elements we need:
- A canvas tag, to draw the audio information. There's a lot of animating data, and the canvas makes it easy to animate many moving parts.
- An audio tag, since we need the audio element to play any audio.
- Lastly, a button to play and pause the audio, so the user keeps control over their computer's audio.
The general idea is that we'll retrieve the audio from a link, set it as the source of the audio tag, and at the same time use it to draw visuals on the canvas. The audio data we retrieve is essentially a mapping where each key is a frequency and each value is the amplitude of that frequency.
One last thing before coding: when I started this project, I wanted to capture whatever audio was currently playing on the machine and visualize that. Doing so requires operating-system-dependent code, so I left it for another time. Due to copyright issues, the audio you can use is limited to your own local files or SoundCloud (you still have to credit the owner and add the SoundCloud logo). You can't use providers like Spotify or YouTube. Don't worry too much if it's just for a personal project, but be aware that if they find out you're not following their terms of service, they can send you a cease and desist and you'll have to pull down your site.
So as I said, we need a canvas, audio, and button element:
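A minimal markup sketch of those three elements (the Font Awesome CDN link is an assumption; the ids match the names used in the text):

```html
<!-- Font Awesome for the play/pause icons (CDN URL is illustrative) -->
<link
  rel="stylesheet"
  href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/5.15.4/css/all.min.css"
/>

<canvas id="visualizer"></canvas>
<audio id="audio-player" crossorigin="anonymous"></audio>
<button id="play-btn"><i class="fas fa-play"></i></button>
```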
Here, I've added the required HTML elements: the visualizer canvas, the audio-player, and the play-btn. I've also added Font Awesome to get a nice-looking play icon.
For CSS, I've centered the play button by offsetting it top 50% and left 50% of the screen, then translating it back by half its own width and height. I've given it a circular border by setting border-radius to 100%. Font Awesome uses the font-size to determine the icon size, which I've set to 40px. I've also removed the button outline on focus (outline: none) so the button doesn't get a weird focus ring. Note that this is bad for accessibility, as screen readers rely on focus outlines to locate buttons, but this is an audio "visualizer", so by definition it's not great for accessibility anyway.
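A sketch of those styles (the 120px button size comes from the math later in the article; the rest follows the description above):

```css
#play-btn {
  position: absolute;
  top: 50%;
  left: 50%;
  /* pull the button back by half its own width/height to truly center it */
  transform: translate(-50%, -50%);
  width: 120px;
  height: 120px;
  border-radius: 100%; /* circular border */
  font-size: 40px;     /* Font Awesome sizes the icon from this */
  outline: none;       /* removes the focus ring (an accessibility trade-off) */
}
```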
Then, I stored the audio player state in the variable playingState, initializing it to the stopped state as the player is not playing and the track hasn’t been loaded yet.
When the user clicks the play button for the first time, the switch statement falls into the stopped case, which loads the track info, plays the audio, sets the state to playing (this happens in the getTrack function), and changes the play icon to a pause icon (by removing the fa-play class and adding fa-pause).
Once the audio is playing and the user clicks the button, the audio player is paused, the play state is changed to paused, and the pause icon is replaced by the play icon (as we are now paused).
If the audio is paused and the user clicks the button, the audio resumes playing, the play state is set to playing, and the play icon is replaced by the pause icon.
When the audio finishes, the audio player fires the ended event (handled via onended), which I use to set the play state to stopped and replace the pause icon with the play icon.
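The four transitions above can be sketched as a small state machine. This is a hedged sketch, not the article's exact code: the player object stands in for the audio element (so the logic runs anywhere), and the icon toggling is elided.

```javascript
// Play-button state machine: stopped -> playing <-> paused, and
// "ended" sends us back to stopped. "player" mimics the <audio> element.
const STOPPED = "stopped", PLAYING = "playing", PAUSED = "paused";

function makeController(player) {
  let playingState = STOPPED;
  return {
    get state() { return playingState; },
    // Wire this to the play button's click handler.
    onClick() {
      switch (playingState) {
        case STOPPED:    // first click: load the track, then play
          player.load(); // in the article this work happens in getTrack()
          player.play();
          playingState = PLAYING;
          break;
        case PLAYING:    // playing -> paused
          player.pause();
          playingState = PAUSED;
          break;
        case PAUSED:     // paused -> playing again
          player.play();
          playingState = PLAYING;
          break;
      }
      // here the real code also swaps the fa-play / fa-pause classes
    },
    // Wire this to the audio element's "ended" event.
    onEnded() { playingState = STOPPED; },
  };
}
```

In the browser you would call `ctrl.onClick` from the button's click listener and `ctrl.onEnded` from the audio element's ended listener.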
So, we need to complete the getTrack function, which fetches the music stream, sets the source, and plays the audio. As mentioned above, I'll be using SoundCloud to get my music; if you're using local music, just move your file into the project folder, set the source (src) to the file, and call the play method.
Using SoundCloud was a challenge, as they haven't issued API keys in a while. Also, the SoundCloud API has been updated to API-V2, which means many articles and the client-id-fetching methods they describe are outdated.
Before looking at the code, I need to tell you a bit about CORS (cross-origin resource sharing). When you build a website, there are two parts to it: the front-end (what the user sees) and the back-end (where the data is stored). When a user visits a site like www.google.com, the front-end is loaded and displayed. As the user interacts with the website, data is loaded from the back-end (let's say it's located at backend.google.com). By default, the browser has a security feature that restricts the front-end to interacting with the origin the user entered, which in this case is www.google.com. So the browser blocks requests to backend.google.com.
One solution is to configure the back-end to accept requests from www.google.com (or from any site), which is perfect if we're the ones building the back-end. Another is to note that this is a browser security feature, so we can run our own CORS-enabled "back-end server" that simply forwards all requests to backend.google.com: a proxy. One well-known proxy is https://cors-anywhere.herokuapp.com/; just request https://cors-anywhere.herokuapp.com/<backend_url> and the browser will be able to bypass CORS. I've used this proxy for the SoundCloud API requests.
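A hedged sketch of what getTrack could look like, assuming the SoundCloud v2 endpoints and the cors-anywhere proxy described here; trackId, clientId, and audioPlayer are supplied by the caller, and error handling is omitted for brevity:

```javascript
const PROXY = "https://cors-anywhere.herokuapp.com/";

// Build the proxied track-info URL for a given track and client id.
function trackInfoUrl(trackId, clientId) {
  return `${PROXY}https://api-v2.soundcloud.com/tracks/${trackId}?client_id=${clientId}`;
}

async function getTrack(trackId, clientId, audioPlayer) {
  // 1. fetch the track metadata to find the stream transcodings
  const info = await (await fetch(trackInfoUrl(trackId, clientId))).json();
  const transcodingUrl = info.media.transcodings[0].url;

  // 2. resolve the transcoding into a playable stream URL
  const stream = await (
    await fetch(`${PROXY}${transcodingUrl}?client_id=${clientId}`)
  ).json();

  // 3. point the <audio> element at the stream and start playback
  audioPlayer.src = stream.url;
  await audioPlayer.play();
  // the article also flips playingState to "playing" here
}
```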
In the code above, I make an API request to https://api-v2.soundcloud.com/tracks/<trackId>?client_id=<clientId> to get the track's streams (transcodings). I found the stream URL within media.transcodings. I then make a second API request to that URL, which returns an object containing the actual stream URL. Setting the audio element's source to that stream URL and calling the play method starts the music streaming from SoundCloud. Also note that I've used async/await for the GET requests, since we need the URLs loaded before playing the audio.
Also note that I've set crossorigin="anonymous" on the audio player to prevent any CORS issues with the audio element itself.
To play a SoundCloud track, you need to retrieve the track id of the song you select, plus your own client id. To do so, go to the song you want to add, press F12 to open the developer tools, switch to the Network panel, and refresh the page. Look for the "/comments" request and you'll find both your track id and client_id.
Add the track id and client_id and test out your application.
Here my track id is 272083179.
Add the track id and the client_id to the code below and you’ve got a working music player.
Now we just need to get the audio data and draw the visuals on the canvas. To capture the audio data, we create an analyser node that intercepts the audio from the audio element and passes it through to the output.
So I created an audioContext from the Web Audio API and used it to create our analyser. I set the FFT size (fftSize) to 2048, which controls how finely the frequency range is split: we get fftSize / 2 = 1024 frequency bins. I captured the source audio from the audio element and connected it to the analyser, which then forwards it on to the audio destination (back out to the speakers). Finally, I defined a data array that stores the amplitude for each frequency bin. Note that I've shrunk it by 382 entries, because the last few frequencies appeared to be nearly silent and the visualizer didn't look that good. With the analyser set up, I can read the data into the array using the analyser's getByteFrequencyData method.
Note: we also need to add await audioCtx.resume(); to the getTrack function, because browsers keep the audio context suspended until it's explicitly resumed after a user interaction.
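Here's a hedged sketch of that wiring. I've factored it into a function that takes the audio context and the audio element as arguments, so the wiring can be exercised with any AudioContext-like object; the fftSize of 2048 and the "- 382" trim come straight from the description above.

```javascript
// Route the <audio> element through an AnalyserNode so we can read
// per-frequency amplitudes each frame, then pass the sound on unchanged.
function setupAnalyser(audioCtx, audioPlayer) {
  const analyser = audioCtx.createAnalyser();
  analyser.fftSize = 2048; // fftSize / 2 = 1024 frequency bins

  // intercept the element's audio, then send it back out to the speakers
  const source = audioCtx.createMediaElementSource(audioPlayer);
  source.connect(analyser);
  analyser.connect(audioCtx.destination);

  // drop the last 382 bins: they were near-silent and flattened the visual
  const data = new Uint8Array(analyser.frequencyBinCount - 382);
  return { analyser, data };
}
```

In the browser: `const { analyser, data } = setupAnalyser(new AudioContext(), audioPlayer);`, then each frame call `analyser.getByteFrequencyData(data);`.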
Now that we have the data, we need to draw the information using the canvas tag.
So we set the width and height of the canvas when the page loads to fill the whole page (window.innerWidth / window.innerHeight). We also set the starting point to the middle of the canvas: half the width and half the height.
Canvas animations use a looping function. requestAnimationFrame schedules that function to run on the next repaint (typically 60fps), so inside the loop we call requestAnimationFrame again to queue the next frame. Each iteration, we get the fresh audio data from the analyser and draw the visual graph; for that, I've defined a draw function that takes the audio data.
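A sketch of that render loop. To keep the sketch runnable anywhere, I've injected raf as a parameter; in the browser it is simply requestAnimationFrame, and analyser, data, and draw are the pieces from the previous steps.

```javascript
// Build the looping render function: schedule the next frame, refresh the
// amplitude array from the analyser, then draw this frame.
function makeLoop(raf, analyser, data, draw) {
  function loop() {
    raf(loop);                           // queue the next frame (~60fps)
    analyser.getByteFrequencyData(data); // refresh the amplitude array
    draw(data);                          // render this frame
  }
  return loop;
}

// browser usage:
// requestAnimationFrame(makeLoop(requestAnimationFrame, analyser, data, draw));
```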
Alright, now for the math part. We're going to draw lines while moving around a circle. The first thing to remember is that our button is 120px by 120px, and I'm using 120 as the radius of our circle.
A circle spans 2π radians (360 degrees). On the canvas, the starting angle of 0 radians points to the right of the center. Since we want to start drawing at the top of the circle, we set our start angle to 75% of 2π: (2 × π × 3) / 4.
To get a position on the circle, we use the angle and radius: the offset from the center is r × cos(angle) horizontally and r × sin(angle) vertically. In our case, the coordinates are x = centerX + r × cos(angle) and y = centerY + r × sin(angle).
Finally, the number of values we need to display is data.length, and to divide the circle equally between them, the spacing between each line is 2π / data.length.
For each frequency/amplitude pair, we display a line whose length depends on the amplitude, starting at 75% of 2π and adding the spacing each time.
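The geometry above, as a small sketch (the real code uses the canvas midpoint as the center; here the center is (0, 0) just to show where the start angle lands):

```javascript
const radius = 120;
const startAngle = (2 * Math.PI * 3) / 4; // 3π/2, the top of the circle

// A point on a circle of radius r around (centerX, centerY) at the given angle.
function pointOnCircle(centerX, centerY, angle, r) {
  return {
    x: centerX + r * Math.cos(angle),
    y: centerY + r * Math.sin(angle),
  };
}

// With the center at (0, 0), the start angle lands at (0, -120):
// canvas y grows downward, so -120 is straight up from the center.
const p = pointOnCircle(0, 0, startAngle, radius);
```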
I initialized all the values as described and cleared the canvas, since the previous frame's drawing would otherwise still be visible.
We draw on the canvas using paths: we begin a path, move to where we want to draw from, draw lines to other points, and finally close the path. For every frequency/amplitude pair, I draw a line from a point on the circle at the current angle to a point amplitude pixels further out along the same angle. I've colored each line with a gradient from its start point to its end point using 3 colors at 0%, 30%, and 100%. After each line, I update x and y using the current angle, and advance the angle by the spacing.
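Putting the whole thing together, here's a hedged sketch of the draw function. The gradient colors are placeholders (use whatever palette you like), ctx is the canvas 2d context, and the geometry values are passed in for clarity.

```javascript
function draw(ctx, data, { centerX, centerY, radius }) {
  // wipe the previous frame
  ctx.clearRect(0, 0, ctx.canvas.width, ctx.canvas.height);

  const space = (2 * Math.PI) / data.length; // equal share of the circle
  let angle = (2 * Math.PI * 3) / 4;         // start at the top

  for (const amplitude of data) {
    // inner point on the circle, outer point pushed out by the amplitude
    const x1 = centerX + radius * Math.cos(angle);
    const y1 = centerY + radius * Math.sin(angle);
    const x2 = centerX + (radius + amplitude) * Math.cos(angle);
    const y2 = centerY + (radius + amplitude) * Math.sin(angle);

    // 3-stop gradient along the line, from inner to outer point
    const gradient = ctx.createLinearGradient(x1, y1, x2, y2);
    gradient.addColorStop(0, "#4158d0");   // placeholder colors
    gradient.addColorStop(0.3, "#c850c0");
    gradient.addColorStop(1, "#ffcc70");

    ctx.beginPath();
    ctx.moveTo(x1, y1);
    ctx.lineTo(x2, y2);
    ctx.strokeStyle = gradient;
    ctx.stroke();

    angle += space; // advance to the next line's slot
  }
}
```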
Challenge: something I added was to keep the canvas animation loop (requestAnimationFrame) running only while the state is playing. Try adding that feature yourself.
That’s it, we’ve learned how to build an audio visualizer. Here’s the completed code, just add your client_id from SoundCloud and try it out.
If you really want to get a good handle on the audio data, use it to build another style of visualization, like transforming some divs with anime.js.
So here’s a reflection on what you’ve learned:
- Using the HTML audio element to build your own player
- Using the Web Audio API to get audio data from a source
- Getting the SoundCloud client_id and track id from any track
- Getting audio data stream from SoundCloud
- Drawing simple diagrams on the canvas in 2d
- Animating the canvas in 2d
- Using angles to compute positions across a circle
- Drawing audio data on the canvas based on frequency and amplitude
Awesome work learning all that.
Thanks for reading, leave a comment if you like this sort of content and want to see more. I have 3 article ideas so far: the interview, building a react book carousel, and one on docker. Hope you enjoy them.