An audio visualizer is no small task, and it would probably take longer to create than your entire game. You need to:
- Analyze the incoming audio data. You can't use Slick's sound engine for this, so you'd need to extend or rewrite it to get at the raw sample data (which would likely require a solid understanding of OpenAL and audio decoding). The least painful approach would be to analyze only a single streaming source (i.e. the background music).
- Run something like an FFT (fast Fourier transform) over that audio data to extract frequency information.
- Transform shapes/images based on various mathematical functions you define over that frequency data.
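To give a feel for the FFT step: below is a minimal, self-contained radix-2 Cooley-Tukey FFT that turns a block of raw samples into a magnitude spectrum. This is purely illustrative — it is not tied to Slick or OpenAL, and in practice you would feed it sample blocks pulled from your (re-written) audio pipeline and probably use a tested DSP library instead.

```java
// Illustrative radix-2 Cooley-Tukey FFT. Input length must be a power of two.
// In a real visualizer the samples would come from your audio engine's
// decode buffer; here they are just a double[] you supply.
public class SimpleFFT {

    // Returns the magnitude of each frequency bin up to the Nyquist bin.
    public static double[] magnitudes(double[] samples) {
        int n = samples.length;
        double[] re = samples.clone();
        double[] im = new double[n];
        fft(re, im);
        double[] mag = new double[n / 2];
        for (int i = 0; i < mag.length; i++) {
            mag[i] = Math.hypot(re[i], im[i]);
        }
        return mag;
    }

    // In-place iterative FFT on the real/imaginary arrays.
    private static void fft(double[] re, double[] im) {
        int n = re.length;
        // Bit-reversal permutation.
        for (int i = 1, j = 0; i < n; i++) {
            int bit = n >> 1;
            for (; (j & bit) != 0; bit >>= 1) j ^= bit;
            j ^= bit;
            if (i < j) {
                double t = re[i]; re[i] = re[j]; re[j] = t;
                t = im[i]; im[i] = im[j]; im[j] = t;
            }
        }
        // Butterfly passes.
        for (int len = 2; len <= n; len <<= 1) {
            double ang = -2 * Math.PI / len;
            for (int i = 0; i < n; i += len) {
                for (int k = 0; k < len / 2; k++) {
                    double wr = Math.cos(ang * k), wi = Math.sin(ang * k);
                    int a = i + k, b = i + k + len / 2;
                    double xr = re[b] * wr - im[b] * wi;
                    double xi = re[b] * wi + im[b] * wr;
                    re[b] = re[a] - xr; im[b] = im[a] - xi;
                    re[a] += xr;        im[a] += xi;
                }
            }
        }
    }

    public static void main(String[] args) {
        // A pure tone at 4 cycles per 64 samples: bin 4 should dominate.
        int n = 64;
        double[] s = new double[n];
        for (int i = 0; i < n; i++) s[i] = Math.sin(2 * Math.PI * 4 * i / n);
        double[] mag = magnitudes(s);
        System.out.println(mag[4] > 10 * mag[3]);
    }
}
```

You would then map bands of the spectrum (bass, mids, highs) to visual parameters each frame.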
The flashy/pulsing effect would likely be created with 3D shapes, a lot of complicated mathematics, and various materials/lights/etc. Basically, out of the question for Slick.
A simpler solution would be to use Slick's particle system to create some moving shapes, and change the way they are emitted based on the game's events and such. For syncing to music, you could store a text file describing the timings of various "key moments" in your streamed music (e.g. climax, breakdown). Then you simply trigger a change in your particles/shapes/etc. whenever the music playback time passes one of these "key moments".
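A sketch of that "key moment" idea: the class below holds cues (added in time order, e.g. parsed from your text file) and is polled once per update with the current playback position. The `CueTrack` name and its methods are made up for illustration; the playback time would come from your sound engine (for instance Slick's streamed `Music` position, or your own clock).

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper for firing "key moments" as music playback passes them.
// Cues must be added in ascending time order (e.g. as read from a text file
// with lines like "12.5 climax").
public class CueTrack {
    private final List<Float> times = new ArrayList<>();
    private final List<String> labels = new ArrayList<>();
    private int next = 0; // index of the next cue that hasn't fired yet

    public void addCue(float seconds, String label) {
        times.add(seconds);
        labels.add(label);
    }

    // Call once per game update. Returns the label of a cue the playback
    // position has just passed, or null if nothing new happened.
    public String poll(float playbackSeconds) {
        if (next < times.size() && playbackSeconds >= times.get(next)) {
            return labels.get(next++);
        }
        return null;
    }

    public static void main(String[] args) {
        CueTrack track = new CueTrack();
        track.addCue(4.0f, "breakdown");
        track.addCue(12.5f, "climax");
        System.out.println(track.poll(1.0f));  // nothing passed yet
        System.out.println(track.poll(4.2f));  // breakdown fires
        System.out.println(track.poll(13.0f)); // climax fires
    }
}
```

When `poll` returns a label, you would switch your particle emitters, colors, or shapes accordingly.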
Making a particle system "flashy" is up to you. You can try Slick's particle editor tool, or you could write some systems yourself that use some fancy physics (Processing has some tutorials that might help).