Unity3D Retreat
I saw a bunch of cool projects in Unity3D in the last month and wanted to try out what they did myself.
Vera and Queenie made an exhibition of sorts out of stories they collected from cab rides. I saw their project through various stages as it grew and grew, but was drawn especially to their use of music-based interaction using FFTs, and of shaders that hide parts of objects from view.
Pete, Anvay, and Dipika made a beautiful 2D scene using Unity's animation rigging and included touch-based interactions to prompt the player to advance the story.
Dror, Neeti, and Mary used Unity's Visual Effect Graph and motion capture to create an animated dancer using particles as their texture, in a tribute to political prisoners.
I'm compiling some of my research into their work with the hope that I can implement some of the tools they used in projects of my own.
Audio Visualization Using FFTs

- Both AudioSource and AudioListener have a GetSpectrumData method, but both the example in the docs and the tutorial above only use the AudioListener version. What's the difference between the two?
- As far as I understand, the audio spectrum is the relative volumes of different frequencies at a given point in the audio. You can separate these frequencies into n bins, where n is a power of 2 greater than or equal to 64.
- The tutorial only uses the first value of the spectrum for the example – does that correspond to the lowest frequency values, i.e., what would most likely give us the beat?
- Can we stream audio directly from a source instead of using audio files stored inside the game? See Audio Streaming Component.
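To make the questions above concrete, here's a minimal sketch of reading the spectrum each frame and pulsing an object with the low-frequency bins. The bin count (512), the number of low bins summed (8), and the scale factor are my own guesses, not values from the tutorial:

```csharp
using UnityEngine;

// Sketch: sample the audio spectrum every frame and scale a target
// transform by the energy in the lowest bins (a rough beat indicator).
public class SpectrumScaler : MonoBehaviour
{
    public Transform target; // object to pulse with the music
    private readonly float[] spectrum = new float[512]; // power of 2, 64–8192

    void Update()
    {
        // Reads everything the AudioListener hears; call
        // audioSource.GetSpectrumData(...) to read one source instead.
        AudioListener.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        // Sum the first few bins: these hold the lowest frequencies.
        float bass = 0f;
        for (int i = 0; i < 8; i++) bass += spectrum[i];

        target.localScale = Vector3.one * (1f + bass * 10f);
    }
}
```

The first bins do correspond to the lowest frequencies (each bin spans sampleRate / 2 / n Hz), which is why summing them works as a crude beat detector.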
Creating The Outline of a Body with the Visual Effect Graph
- While Unity's particle system allows you to emit particles from a mesh, e.g., the surface of a cube, I haven't figured out how to get these particles to simply rest on the surface instead of escaping with some velocity. Might be worth experimenting with the settings a little bit more since the VFX Graph seems to be more resource intensive.
- Using the VFX Graph requires URP or HDRP, and the SDF Bake Tool is only available in Unity 2021. I had to upgrade to 2021 and change the render settings accordingly. The steps are as follows: (1) install Universal RP from the Unity Registry in Package Manager; (2) Create > Renderer > URP Asset (with Universal Renderer), which adds two URP objects to your project; (3) Edit > Project Settings > Graphics, and choose the new URP Asset under Scriptable Render Pipeline Settings; (4) Edit > Rendering > Materials > Convert Selected Built-In Materials to URP. This process is a pain in the ass if you're already well into a project, so I would usually just opt for creating a new project from scratch.
- I managed to render particles over the surface of a cube, following the tutorials above. I wanted to render 3D particles, but realized that a large volume of particles, combined with rotations, turbulence, and other effects, creates the impression of 3D. VFX Graph doesn't let you render on a mesh by default – to do so you need to change the Visual Effects settings under Preferences to allow experimental operators/blocks. As it turns out, the skinned mesh renderer block doesn't let me generate particles over the surface of my humanoid model, so I need to investigate further.
- Look into the SDF Bake Tool – might be what I need to render particles over a mesh like a body. See also Keijiro's repository for how SDF can be used to render particles over complex surfaces.
Importing Humanoid Models into Unity with Mixamo
- I've had an allergy to importing 3D models into Unity so I've avoided anything with animations for a long time, but the video above shows that it's a relatively painless process if you know the steps.
- Unity prefers .fbx models. Avoid .obj files, as their materials cannot be imported.
Streaming Sound into Unity
The next piece of visualizing audio that I was hoping to accomplish was sourcing the audio without having to download or take up memory in the scene itself. In other words, stream sound directly into a game. Unfortunately, it's not as simple as I thought – some solutions are out there but either cost money or are poorly documented. I summarize them below:
AudioStream for Unity

Radio PRO

BASS.NET
Soundcloud API

UnityWebRequest from Server

- This is the most vanilla option, but it requires having a server set up with music or video assets. Seeing that part of what I'm trying to avoid is re-uploading audio somewhere, and that (I think) I can't use a simple solution like a Google Drive or Dropbox link, it doesn't seem like the solution I want.
- It would be relatively easy to make an S3 bucket, though. Technically this website also has a MySQL server already. Given that I'll be using a server eventually for my other projects I might want to get familiar with those steps right now.
- I can use Filezilla to access my server and edit a directory that contains the assets I want to serve. Filezilla is easy as long as I have the .pem key.
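If I do go the server route, the fetch itself is straightforward. A minimal sketch, assuming an OGG file at a placeholder S3-style URL (substitute your own asset location):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Sketch of the "vanilla" option: fetch an audio file from a server
// and play it through an AudioSource.
public class StreamedAudio : MonoBehaviour
{
    // Hypothetical URL; replace with your own hosted asset.
    public string url = "https://example-bucket.s3.amazonaws.com/track.ogg";

    IEnumerator Start()
    {
        using (UnityWebRequest req =
            UnityWebRequestMultimedia.GetAudioClip(url, AudioType.OGGVORBIS))
        {
            // Decode while downloading instead of waiting for the full file.
            ((DownloadHandlerAudioClip)req.downloadHandler).streamAudio = true;

            yield return req.SendWebRequest();

            if (req.result != UnityWebRequest.Result.Success)
            {
                Debug.LogError(req.error);
                yield break;
            }

            AudioSource source = gameObject.AddComponent<AudioSource>();
            source.clip = DownloadHandlerAudioClip.GetContent(req);
            source.Play();
        }
    }
}
```

Setting streamAudio means playback can start before the whole file has downloaded, which partially addresses the memory concern.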

Best Workaround might be Unity VideoPlayer?
- Very easy to set up, but doesn't work with YouTube links if streaming. However, you can use Vimeo links (as well as other sites), and Vimeo allows livestreaming – in theory, then, you can stream music that way.
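Setting up the VideoPlayer for a URL really is only a few lines. A sketch, with a placeholder URL (it must be a direct file or progressive stream link, not a page URL):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Sketch: play a remote video URL with Unity's VideoPlayer.
// Direct .mp4 / progressive links work; YouTube page URLs do not.
public class UrlVideo : MonoBehaviour
{
    // Placeholder URL — swap in a real direct link.
    public string url = "https://example.com/stream.mp4";

    void Start()
    {
        var player = gameObject.AddComponent<VideoPlayer>();
        player.source = VideoSource.Url;
        player.url = url;
        player.audioOutputMode = VideoAudioOutputMode.Direct; // play audio directly
        player.Play();
    }
}
```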
Revisiting Making Particle Outlines
As for some more good news, I figured out how to use the Particle System to provide a character with a silhouette made of particles! To start, I did try the SDF Bake Tool (Window > Visual Effects > Utilities > SDF Bake Tool) and while it did do a good job of capturing the shape of the mesh and let me apply the SDF to the VFX Graph, the resulting particles wouldn't move with the model, so it wouldn't work with my animated character.
To use the Particle System, first I reduced the start speed of the particles to zero, and their start size to 0.05 or 0.01, so that they could be used to give some resolution to the model's features. I increased the emission rate to 1000–2000, which also meant increasing the max particles. I changed the shape to a skinned mesh renderer, assigning the mesh attached to the joints and body of the model in the scene. I had initially made the mistake of using the meshes attached to the prefab in Assets, but it makes sense that the mesh should come from the scene itself.

The model's body is a single skinned mesh, and the Particle System will distribute all of the particles over the entire mesh, meaning areas that need more definition (e.g., the head) will not necessarily have more particles attached. While I couldn't achieve the resolution I'd hoped for, at least for this model, it's nice to have this visual effect in my pocket. Just changing the lifetime, size, and type (under Shape) of the particles helps me get close, too.
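The inspector steps above can also be set from a script, which makes the recipe easier to reuse. A sketch, with the particle counts taken from my settings above:

```csharp
using UnityEngine;

// Sketch of the silhouette setup: zero start speed, tiny start size,
// high emission rate, and a SkinnedMeshRenderer shape taken from the
// model instance in the scene (not the prefab in Assets).
public class ParticleSilhouette : MonoBehaviour
{
    public SkinnedMeshRenderer bodyMesh; // assign the scene instance's mesh

    void Start()
    {
        var ps = gameObject.AddComponent<ParticleSystem>();

        var main = ps.main;
        main.startSpeed = 0f;     // particles rest on the surface
        main.startSize = 0.05f;
        main.maxParticles = 5000; // raised to match the emission rate

        var emission = ps.emission;
        emission.rateOverTime = 2000f;

        var shape = ps.shape;
        shape.shapeType = ParticleSystemShapeType.SkinnedMeshRenderer;
        shape.skinnedMeshRenderer = bodyMesh;
    }
}
```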
Revisiting Delegates & Callbacks
I've used delegates and callbacks before, and it took me a minute to understand them; after not using them for a while, I once again needed a review. Unfortunately, there don't seem to be many resources explaining their usefulness in Unity, so I've collected a few explanations that make sense to me:




Unfortunately, the Unity tutorial offers the least explanation of these links, but at least gives an example of application.
A delegate describes a method signature: its parameters and return type. An instance of a delegate, or a handler, can be assigned any number of methods that match the delegate; these methods must have the same parameter types and return type. When the delegate is invoked, the methods are called in the order they were assigned (if multiple were added using the += operator). The delegate can be passed as a callback by adding it as a parameter to another method, which does not itself need to have the same parameters as the delegate. Consider the following example:
using UnityEngine;

public class GreetingsExample : MonoBehaviour {
    // the delegate type: any method taking a string and returning void
    public delegate void SendGreetings(string name);
    public static SendGreetings SendGreetingsHandler;

    private void Start() {
        // both methods match the delegate's signature, so both can be assigned
        SendGreetingsHandler += SayHappyHolidays;
        SendGreetingsHandler += SayHappyNewYear;
    }

    private void SayHappyHolidays(string name) {
        Debug.Log("Happy Holidays, " + name + "!");
    }

    private void SayHappyNewYear(string name) {
        Debug.Log(name + ", I hope you have a wonderful New Year!");
    }

    private void SayGreetings(string firstName, string lastName, SendGreetings callback) {
        // all methods assigned to the delegate run, in assignment order;
        // ?.Invoke guards against an empty handler
        callback?.Invoke(firstName + " " + lastName);
    }

    private void OnMouseDown() {
        // requires a Collider on this GameObject to receive mouse events
        SayGreetings("Mashi", "Zaman", SendGreetingsHandler);
    }
}
Maybe this isn't a great example, but I think I got all the parts.
Using Live Audio Input From Microphone
- Livestreaming from Vimeo or something like it is one option, but I can also use a microphone to record sound from a live performance.
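The microphone route is only a few lines too. A minimal sketch: record from the default device into a looping clip and play it back, so GetSpectrumData can analyze the live input (the 10-second buffer and 44.1 kHz rate are arbitrary choices):

```csharp
using UnityEngine;

// Sketch: route live microphone input through an AudioSource.
public class MicInput : MonoBehaviour
{
    void Start()
    {
        var source = gameObject.AddComponent<AudioSource>();
        // null = default device; loop over a 10-second buffer at 44.1 kHz.
        source.clip = Microphone.Start(null, true, 10, 44100);
        source.loop = true;

        // Wait until the mic has actually started recording before playing.
        while (Microphone.GetPosition(null) <= 0) { }
        source.Play();
    }
}
```

Note that playing the mic back through the listener means the spectrum code from the FFT section works unchanged on live sound.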