Such a great tutorial. Your way of teaching is really clear and concise. You should look into doing a full unity tutorial series from beginner to advanced.
I'm here in 2022 - this is by far the best tutorial I've seen about audio generation, every darn line of code still works in Unity 2021 and I learned a truckload! :) I wish to add random beeps to my game now, and wish I had seen this when I was working on my game projects at uni! :) Thank you for making this video and sharing so much information :)
I think you want to use phase -= Mathf.PI * 2 instead of resetting to 0 to get a smoother sine wave. Nice tutorial, though. This opened my eyes to making my own synth.
If anyone is getting a glitchy sound instead of a smooth sine wave: change the code at 3:50 from "phase = 0.0" to "phase -= Mathf.PI * 2". That way it maintains a smooth wave.
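The fix above is easy to sanity-check outside Unity. The tutorial's code is C#; this is just the same phase arithmetic in plain Python with a made-up frequency. Wrapping with `phase -= 2π` reproduces the exact un-wrapped sine, while resetting to 0 drops the fractional overshoot each cycle and drifts away from it:

```python
import math

SAMPLE_RATE = 48000
FREQUENCY = 441.7  # deliberately not an exact divisor of the sample rate
INCREMENT = FREQUENCY * 2.0 * math.pi / SAMPLE_RATE
N = SAMPLE_RATE    # one second of samples

def render(wrap_smooth):
    """One second of sine, wrapping phase either by -= 2*pi or by reset-to-0."""
    phase, out = 0.0, []
    for _ in range(N):
        out.append(math.sin(phase))
        phase += INCREMENT
        if phase > 2.0 * math.pi:
            phase = phase - 2.0 * math.pi if wrap_smooth else 0.0
    return out

reference = [math.sin(k * INCREMENT) for k in range(N)]  # no wrapping at all
smooth, reset = render(True), render(False)

# Subtracting 2*pi is mathematically a no-op for sin(), so 'smooth' matches
# the reference; resetting to 0 loses the overshoot and audibly diverges.
print(max(abs(a - b) for a, b in zip(smooth, reference)))
print(max(abs(a - b) for a, b in zip(reset, reference)))
```

The reset version effectively quantizes each cycle to a whole number of samples, which detunes and distorts the tone.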
Fantastic tutorial! I would love to see a second tutorial or possibly the project files on git to further study and find out how the synth works exactly! Thank you!
Great tutorial, thanks! It really helped me understand OnAudioFilterRead a little better and I'll definitely be using that knowledge in my own project :)
Hey man, cool video! I'm currently trying to code a script that enables me to process every sample of an audio source differently. Do you have a video on that, or can you tell me how to go about it?
Super great tutorial! I have tried extending it so one game object can play multiple notes by adding the sine waves for each note (i.e. wave[i] += (the maths bit) in a loop through all active frequencies), but this sounds pretty bad. What's the best way to add two or more sound waves together?
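Not the author, but two things usually go wrong when summing notes: every note needs its own phase accumulator (sharing one makes the frequencies fight), and the summed amplitudes need scaling down or the mix clips past ±1. A rough sketch of the idea in plain Python (the names, gain value, and per-note scaling are my own, not from the video):

```python
import math

SAMPLE_RATE = 48000

def mix_notes(frequencies, n_samples, gain=0.5):
    """Sum one sine per note, each with its own phase accumulator,
    scaled by the note count so the mix never exceeds the gain ceiling."""
    increments = [f * 2.0 * math.pi / SAMPLE_RATE for f in frequencies]
    phases = [0.0] * len(frequencies)
    data = [0.0] * n_samples
    for i in range(n_samples):
        for n, inc in enumerate(increments):
            data[i] += gain * math.sin(phases[n]) / len(frequencies)
            phases[n] += inc
            if phases[n] > 2.0 * math.pi:
                phases[n] -= 2.0 * math.pi  # smooth wrap, per note
    return data

chord = mix_notes([261.63, 329.63, 392.00], SAMPLE_RATE)  # C major triad
print(max(abs(s) for s in chord))  # stays within the 0.5 gain ceiling
```

Dividing by the note count is the bluntest anti-clipping option; a softer master limiter sounds better when notes come and go.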
Is it possible to use OnAudioFilterRead to change tempo? Any idea how to do it? I want to make the music's tempo proportional to an event without messing with the pitch, but I don't know shit about audio and it's difficult to find documentation on the matter, never mind accessible documentation.
Great tutorial! I have a question: if I want to make a piano where you can press two or more notes at the same time, is it necessary to create another oscillator for each extra sound?
Do you have a solution to the popping sound when you press and release the spacebar? You can hear it right around 6:30, and I get exactly the same artifacting when I run the code myself.
I found a solution. Popping is caused by sudden, sharp changes to the gain. Going from 0 to 0.1 gain in one sample can create a sharp ridge in the waveform, which is very likely to be audible. I solved this by ramping the change in gain over a few milliseconds. Smoothing the volume this way removes the artifacts without perceptibly changing the sound.
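For anyone who wants the idea concretely: here's the same trick sketched in plain Python (the video's code is C#, and the 5 ms ramp length is just my guess at a reasonable value). The gain moves at most a fixed step per sample toward its target, so a key press or release never produces a vertical edge in the waveform:

```python
import math

SAMPLE_RATE = 48000
RAMP_SECONDS = 0.005                       # ~5 ms fade, short enough to feel instant
STEP = 1.0 / (RAMP_SECONDS * SAMPLE_RATE)  # max gain change per sample

def render_gated(n_samples, target_gain, frequency=440.0):
    """Sine voice whose gain ramps linearly toward the target instead of jumping."""
    inc = frequency * 2.0 * math.pi / SAMPLE_RATE
    phase, gain, out = 0.0, 0.0, []
    for _ in range(n_samples):
        # move gain at most STEP per sample toward the target
        if gain < target_gain:
            gain = min(gain + STEP, target_gain)
        elif gain > target_gain:
            gain = max(gain - STEP, target_gain)
        out.append(gain * math.sin(phase))
        phase += inc
        if phase > 2.0 * math.pi:
            phase -= 2.0 * math.pi
    return out

tone = render_gated(SAMPLE_RATE // 10, target_gain=0.1)
```

Every sample-to-sample jump now stays tiny, instead of the 0→0.1 cliff you get by flipping the gain directly.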
Hey, in this example, are you playing an external synthesizer? I am trying something similar. Can we produce and play back inside Unity the same sound that an external synthesizer plays? Thanks in advance.
You could use Keijiro's MIDI Jack if you wanted to use a MIDI keyboard to play the synthesizer (or anything else) in Unity > github.com/keijiro/MidiJack
Hello, on this site (www.visipiano.com/midi-to-json-converter/) I can convert a MIDI file to JSON. How can I read the notes of the song and their durations so I can play them in Unity? And is there a way to convert MIDI to JSON directly in Unity?
Awesome tutorial, first of all. I'm getting an error, though. When I tried incrementing the frequency on every spacebar press, the line "frequency = frequencies[thisFreq];" threw "IndexOutOfRangeException: Array index is out of range." My frequencies array is clearly not working. Could you help me?
Hey man, great video! Got a question though. After building this I wanted to add buttons that play some beat loops in the background for me to jam over, but it seems that OnAudioFilterRead blocks any audio that is not from this synthesizer. Glad to hear any help :):):):)
The data parameter in OnAudioFilterRead is a buffer of raw audio samples. You could write that buffer to a WAV file every time the function is called, keeping in mind that the samples for each channel are interleaved.
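In case a concrete sketch helps: here is the float-to-PCM conversion in plain Python (in Unity you'd do the equivalent in C#, and since OnAudioFilterRead runs on the audio thread, a real project would queue the buffers and write the file from the main thread). The interleaved layout and 16-bit conversion are standard WAV practice; the file name and test tone are made up:

```python
import math
import struct
import wave

SAMPLE_RATE = 48000
CHANNELS = 2

# Fake an interleaved stereo float buffer like the 'data' array Unity hands you:
# [L0, R0, L1, R1, ...], values in -1..1.
n_frames = SAMPLE_RATE // 10
buf = []
for i in range(n_frames):
    s = 0.25 * math.sin(2.0 * math.pi * 440.0 * i / SAMPLE_RATE)
    buf += [s, s]  # same tone on both channels

def write_wav(path, interleaved, sample_rate=SAMPLE_RATE, channels=CHANNELS):
    """Clamp -1..1 floats, convert to 16-bit PCM, and write a WAV file."""
    pcm = struct.pack("<%dh" % len(interleaved),
                      *(int(max(-1.0, min(1.0, x)) * 32767) for x in interleaved))
    with wave.open(path, "wb") as f:
        f.setnchannels(channels)
        f.setsampwidth(2)   # 16 bits per sample
        f.setframerate(sample_rate)
        f.writeframes(pcm)

write_wav("capture.wav", buf)
```

To capture a whole session you'd open the file once and keep appending frames rather than rewriting it per callback.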
I'm using your sine waveform formula to create triangle, square and other waveforms by adding extra sinusoids in the data[i] = ... expression, but it doesn't come out as clean as your triangle formula. Should I just add more sinusoids?
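Adding more sinusoids is indeed the standard answer (additive synthesis), but the amplitudes matter: a square wave is the odd harmonics at 1/k amplitude, and a triangle is the odd harmonics at 1/k² with alternating sign. A small Python check of the square-wave series (the probe point and harmonic counts here are arbitrary choices of mine, not from the video):

```python
import math

def square_additive(phase, n_harmonics):
    """Fourier-series square wave: odd harmonics k = 1, 3, 5, ... at 1/k amplitude."""
    total = 0.0
    for k in range(1, 2 * n_harmonics, 2):
        total += math.sin(k * phase) / k
    return total * 4.0 / math.pi  # scale so the ideal wave sits at +/-1

probe = math.pi / 2  # middle of the 'high' half of the cycle, ideal value 1.0
few = square_additive(probe, 5)
many = square_additive(probe, 200)
# With more harmonics the sum settles toward the ideal square wave
# (except right at the jumps, where Gibbs ripple never fully disappears).
print(few, many)
```

With the wrong amplitude weights the partials never converge to the target shape, which is probably why it "doesn't come out as clean".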
Dano Kablamo I know, I can't do that though because I want different sound sources to play at the same time, and that resulted in a buggy mess where the second one wouldn't play
Dano Kablamo that's the problem! The second one isn't playing any sound at all! I'm pretty sure it receives input too; it just doesn't play anything (I am using a Vive headset to create a VR analog synth, maybe it has an audio listener somewhere bugging it all?)
I can't remember how I did it in this tutorial, but if your filter script sits on the AudioListener, OnAudioFilterRead intercepts all audio; attach the script to a GameObject with its own AudioSource instead so you can run multiple at once.
Real instruments are pretty tricky: their sounds are made up of multiple differently pitched overtones alongside the main tone, as well as other differences in the shape of the wave caused by how the instrument behaves in the real world.
Magic! Jk, it's a LineRenderer and you use OnAudioFilterRead, except you don't change the data, you just read it. It's complicated AF. I'll make a tutorial for it too.