2013-09-22, 20:08
As I sit here listening to a Vocaloid (Hatsune Miku) singing through XBMC...
I wonder: would it be easy to invoke a Java application as a visualization plugin?
I've got a Vocaloid avatar dancing on my phone as a background, and it would be so nice if it were a visualization for XBMC.
I am about to start "playing" with MikuMikuStudio; the output of this could become a plugin for XBMC if I knew how to make a visualization. It uses OpenGL, if that helps (or is that a problem?).
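
From poking at the bundled visualisations (Spectrum, projectM), it looks like a visualization is a native C/C++ addon that exports a few entry points and draws straight into XBMC's OpenGL context, so a Java/MikuMikuStudio renderer would probably need some kind of JNI bridge behind a layer like this. The names and signatures below are only my guess at the interface in xbmc_vis_dll.h and would need checking against the XBMC source tree:

    // Rough skeleton of a native XBMC visualisation addon, rendered with OpenGL.
    // NOTE: these entry points are assumptions based on what the stock
    // visualisations appear to export; verify against xbmc_vis_dll.h.
    #include <GL/gl.h>

    extern "C"
    {
      // Called when playback of a track starts; the song name could be used
      // to pick matching motion/dance data for that track (hypothetical step).
      void Start(int iChannels, int iSamplesPerSec, int iBitsPerSample,
                 const char* szSongName)
      {
        // load avatar model and motion data keyed on szSongName
      }

      // Called regularly with waveform (and optionally FFT) data; this is
      // where beat detection would hook in.
      void AudioData(const float* pAudioData, int iAudioDataLength,
                     float* pFreqData, int iFreqDataLength)
      {
        // feed samples into a beat detector to drive animation timing
      }

      // Called once per frame with a live GL context, so the addon can draw
      // with plain OpenGL calls.
      void Render()
      {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        // draw the dancing avatar for the current animation frame
      }
    }

If that is roughly how the interface works, the Java side would only ever be reached through native glue, which is probably the hard part.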
It'd be especially nice if, when you played a specific song that you have motion data for, the Vocaloid danced the correct dance for it, with the correct avatar(s) dancing and singing, etc.
Even down to beat detection and somehow having the system pick the best motion data for the tune. I don't know where it could lead; just dreams for now.
Reading the descriptions so far, it sounds similar to a screensaver. Or is that just a type of visualization?