By mapping between musical notes and colors, this project seeks to bring attention to interactive art made accessible to people who are visually impaired or have no functional hearing. Using machine learning, I trained a model on 5 specific movements; each movement generates one tone of the solfège (DO-RE-MI-FA-SOL), and each tone has a color assigned to it. “Sound can be seen, and color can be heard.” I was inspired by the prompt below:
If you take the frequencies of each note in Hz (in equal temperament) and multiply them by 2^40 (i.e., raise them 40 octaves), you get a number in the THz range, which would fall into the visible range if it represented the frequency of an electromagnetic wave instead of a sound wave.
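As a quick sanity check of that arithmetic, here is a minimal Python sketch (not part of the project code) that maps the five solfège notes to light frequencies and wavelengths. The equal-temperament frequencies and the visible-light window are standard values; the extra octave-doubling loop for notes whose 40th-octave frequency falls just short of the visible band is my own assumption about how such cases could be handled.

```python
# Minimal sketch of the note-to-color arithmetic from the prompt above.
# Assumption: notes that land below the visible band after 40 octaves
# are doubled (raised one more octave) until they fall inside it.

SPEED_OF_LIGHT = 2.998e8  # m/s

# Equal-temperament frequencies (Hz) for the five solfège notes, octave 4.
NOTES = {
    "DO": 261.63,   # C4
    "RE": 293.66,   # D4
    "MI": 329.63,   # E4
    "FA": 349.23,   # F4
    "SOL": 392.00,  # G4
}

VISIBLE_MIN_THZ = 400.0  # ~750 nm (red edge)
VISIBLE_MAX_THZ = 790.0  # ~380 nm (violet edge)


def note_to_light(freq_hz: float) -> tuple[float, float]:
    """Raise a pitch 40 octaves (x2 per octave, so x2**40), then keep
    doubling until it reaches the visible band; return (THz, nm)."""
    f = freq_hz * 2 ** 40  # 40 octaves up, as in the prompt
    while f < VISIBLE_MIN_THZ * 1e12:
        f *= 2  # one more octave for notes that fall just short
    wavelength_nm = SPEED_OF_LIGHT / f * 1e9
    return f / 1e12, wavelength_nm


if __name__ == "__main__":
    for name, hz in NOTES.items():
        thz, nm = note_to_light(hz)
        print(f"{name:>3}: {hz:7.2f} Hz -> {thz:6.1f} THz -> {nm:5.1f} nm")
```

Under these assumptions, SOL (392 Hz) lands directly in the visible band at about 696 nm (red), while DO through FA need one extra octave to cross the 400 THz threshold, placing DO around 521 nm (green).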