be.hogent.tarsos.dsp
Class AudioDispatcher

java.lang.Object
  extended by be.hogent.tarsos.dsp.AudioDispatcher
All Implemented Interfaces:
java.lang.Runnable

public final class AudioDispatcher
extends java.lang.Object
implements java.lang.Runnable

This class plays a file and sends float arrays to registered AudioProcessor implementors. It can be used to feed FFTs, pitch detectors, audio players, and so on. Using a (blocking) audio player it is even possible to synchronize execution of AudioProcessors with the sound. This behavior can be used for visualization.

Author:
Joren Six
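The dispatcher's core behavior, slicing the audio signal into overlapping float buffers and handing each buffer to the processor chain, can be pictured with a small pure-Java sketch. This is illustrative code only, not TarsosDSP internals; the buffer and overlap sizes are the common FFT values mentioned below.

```java
import java.util.ArrayList;
import java.util.List;

public class OverlapSlicer {

    /**
     * Slice a signal into buffers of bufferSize samples where consecutive
     * buffers share 'overlap' samples, mimicking how AudioDispatcher feeds
     * float arrays to its registered AudioProcessors.
     */
    public static List<float[]> slice(float[] signal, int bufferSize, int overlap) {
        int hop = bufferSize - overlap; // samples advanced per step
        List<float[]> buffers = new ArrayList<>();
        for (int start = 0; start + bufferSize <= signal.length; start += hop) {
            float[] buffer = new float[bufferSize];
            System.arraycopy(signal, start, buffer, 0, bufferSize);
            buffers.add(buffer);
        }
        return buffers;
    }

    public static void main(String[] args) {
        float[] signal = new float[4096];
        // 1024-sample buffers with 512 samples of overlap: a common FFT setup
        List<float[]> buffers = slice(signal, 1024, 512);
        System.out.println(buffers.size() + " buffers");
    }
}
```

With a 4096-sample signal, 1024-sample buffers and 512 samples of overlap, the hop size is 512 and the loop yields 7 buffers.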

Constructor Summary
AudioDispatcher(javax.sound.sampled.AudioInputStream stream, int audioBufferSize, int bufferOverlap)
          Create a new dispatcher from a stream.
AudioDispatcher(int audioBufferSize)
           
 
Method Summary
 void addAudioProcessor(AudioProcessor audioProcessor)
          Adds an AudioProcessor to the chain of processors.
 long durationInFrames()
          Returns the length of the stream, expressed in sample frames rather than bytes.
 double durationInSeconds()
          Returns the duration of the stream in seconds.
static AudioDispatcher fromByteArray(byte[] byteArray, javax.sound.sampled.AudioFormat audioFormat, int audioBufferSize, int bufferOverlap)
          Create a stream from an array of bytes and use that to create a new AudioDispatcher.
static AudioDispatcher fromDefaultMicrophone(int audioBufferSize, int bufferOverlap)
          Create a new AudioDispatcher connected to the default microphone.
static AudioDispatcher fromFile(java.io.File audioFile, int size, int overlap)
          Create a stream from a file and use that to create a new AudioDispatcher.
static AudioDispatcher fromFloatArray(float[] floatArray, int sampleRate, int audioBufferSize, int bufferOverlap)
          Create a stream from an array of floats and use that to create a new AudioDispatcher.
static AudioDispatcher fromURL(java.net.URL audioStream, int size, int overlap)
          Create a stream from a URL and use that to create a new AudioDispatcher.
 javax.sound.sampled.AudioFormat getFormat()
           
 void removeAudioProcessor(AudioProcessor audioProcessor)
          Removes an AudioProcessor from the chain of processors and calls processingFinished.
 void run()
           
 void setStepSizeAndOverlap(int audioBufferSize, int bufferOverlap)
          Set a new step size and overlap size.
 void setZeroPad(boolean zeroPad)
          If zero pad is true, only the last hop size samples of the first buffer contain audio; the rest are zeros.
 void skip(double seconds)
          Skip a number of seconds before processing the stream.
 void stop()
          Stops dispatching audio data.
 
Methods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Constructor Detail

AudioDispatcher

public AudioDispatcher(javax.sound.sampled.AudioInputStream stream,
                       int audioBufferSize,
                       int bufferOverlap)
                throws javax.sound.sampled.UnsupportedAudioFileException
Create a new dispatcher from a stream.

Parameters:
stream - The stream to read data from.
audioBufferSize - The size of the buffer defines how much samples are processed in one step. Common values are 1024,2048.
bufferOverlap - How much consecutive buffers overlap (in samples). Half of the AudioBufferSize is common (512, 1024) for an FFT.
Throws:
javax.sound.sampled.UnsupportedAudioFileException - If an unsupported format is used.

AudioDispatcher

public AudioDispatcher(int audioBufferSize)
Method Detail

durationInSeconds

public double durationInSeconds()
Returns the duration of the stream in seconds. If the length of the stream cannot be determined (e.g. microphone input), it returns a negative number.

Returns:
The duration of the stream in seconds or a negative number.

durationInFrames

public long durationInFrames()
Returns the length of the stream, expressed in sample frames rather than bytes.

Returns:
The length of the stream, expressed in sample frames rather than bytes.
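The two duration methods are related through the stream's frame rate: seconds = frames / frame rate. A small sketch of that conversion using the standard javax.sound.sampled.AudioFormat; the 44100 Hz mono format here is just an example choice.

```java
import javax.sound.sampled.AudioFormat;

public class DurationMath {

    /** Convert a length in sample frames to seconds using the stream's frame rate. */
    public static double framesToSeconds(long frames, AudioFormat format) {
        return frames / (double) format.getFrameRate();
    }

    public static void main(String[] args) {
        // 44100 Hz, 16 bit, mono, signed, big endian
        AudioFormat format = new AudioFormat(44100f, 16, 1, true, true);
        // 88200 frames at 44100 frames per second is exactly 2 seconds
        System.out.println(framesToSeconds(88200L, format));
    }
}
```

For PCM audio the frame rate equals the sample rate, so durationInFrames divided by the sample rate gives durationInSeconds.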

skip

public void skip(double seconds)
Skip a number of seconds before processing the stream.

Parameters:
seconds - The number of seconds to skip.

setStepSizeAndOverlap

public void setStepSizeAndOverlap(int audioBufferSize,
                                  int bufferOverlap)
Set a new step size and overlap size, both in number of samples. Use this method with care: it should be called after a batch of samples has been processed, not while one is being processed.

Parameters:
audioBufferSize - The size of the buffer defines how many samples are processed in one step. Common values are 1024, 2048.
bufferOverlap - The number of samples consecutive buffers overlap. Half of the audioBufferSize is common (512, 1024) for an FFT.

setZeroPad

public void setZeroPad(boolean zeroPad)
If zero pad is true, only the last hop size samples of the first buffer contain audio; the first buffer size - hop size samples are zero. E.g. if the buffer size is 2048 and the hop size is 48, the first buffer contains 2000 zeros followed by 48 audio samples.

Parameters:
zeroPad - true if the buffer should be zero padded, false otherwise.
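The zero-padded first buffer described above can be sketched in plain Java (illustrative code, not the library's implementation): with a buffer size of 2048 and a hop size of 48, the first buffer holds 2000 zeros followed by the first 48 audio samples.

```java
public class ZeroPadDemo {

    /**
     * Build a zero-padded first buffer: only the last hopSize slots are
     * filled with audio; the first (bufferSize - hopSize) slots stay zero.
     */
    public static float[] firstBuffer(float[] signal, int bufferSize, int hopSize) {
        float[] buffer = new float[bufferSize]; // Java initializes floats to 0.0f
        System.arraycopy(signal, 0, buffer, bufferSize - hopSize, hopSize);
        return buffer;
    }

    public static void main(String[] args) {
        float[] signal = new float[48];
        java.util.Arrays.fill(signal, 1.0f);
        float[] first = firstBuffer(signal, 2048, 48);
        // index 1999 is still zero padding, index 2000 is the first audio sample
        System.out.println(first[1999] + " " + first[2000]);
    }
}
```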

addAudioProcessor

public void addAudioProcessor(AudioProcessor audioProcessor)
Adds an AudioProcessor to the chain of processors.

Parameters:
audioProcessor - The AudioProcessor to add.

removeAudioProcessor

public void removeAudioProcessor(AudioProcessor audioProcessor)
Removes an AudioProcessor from the chain of processors and calls processingFinished.

Parameters:
audioProcessor - The AudioProcessor to remove.

run

public void run()
Specified by:
run in interface java.lang.Runnable

stop

public void stop()
Stops dispatching audio data.


getFormat

public javax.sound.sampled.AudioFormat getFormat()
Returns the AudioFormat of the stream being processed.

fromFile

public static AudioDispatcher fromFile(java.io.File audioFile,
                                       int size,
                                       int overlap)
                                throws javax.sound.sampled.UnsupportedAudioFileException,
                                       java.io.IOException
Create a stream from a file and use that to create a new AudioDispatcher.

Parameters:
audioFile - The file.
size - The number of samples used in the buffer.
overlap - The number of samples consecutive buffers overlap.
Returns:
A new AudioDispatcher.
Throws:
javax.sound.sampled.UnsupportedAudioFileException - If the audio file is not supported.
java.io.IOException - When an error occurs reading the file.

fromURL

public static AudioDispatcher fromURL(java.net.URL audioStream,
                                      int size,
                                      int overlap)
                               throws javax.sound.sampled.UnsupportedAudioFileException,
                                      java.io.IOException
Create a stream from a URL and use that to create a new AudioDispatcher.

Parameters:
audioStream - The URL describing the stream.
size - The number of samples used in the buffer.
overlap - The number of samples consecutive buffers overlap.
Returns:
A new AudioDispatcher.
Throws:
javax.sound.sampled.UnsupportedAudioFileException - If the audio file is not supported.
java.io.IOException - When an error occurs reading the file.

fromByteArray

public static AudioDispatcher fromByteArray(byte[] byteArray,
                                            javax.sound.sampled.AudioFormat audioFormat,
                                            int audioBufferSize,
                                            int bufferOverlap)
                                     throws javax.sound.sampled.UnsupportedAudioFileException
Create a stream from an array of bytes and use that to create a new AudioDispatcher.

Parameters:
byteArray - An array of bytes, containing audio information.
audioFormat - The format of the audio represented using the bytes.
audioBufferSize - The size of the buffer defines how many samples are processed in one step. Common values are 1024, 2048.
bufferOverlap - The number of samples consecutive buffers overlap. Half of the audioBufferSize is common.
Returns:
A new AudioDispatcher.
Throws:
javax.sound.sampled.UnsupportedAudioFileException - If the audio format is not supported.

fromDefaultMicrophone

public static AudioDispatcher fromDefaultMicrophone(int audioBufferSize,
                                                    int bufferOverlap)
                                             throws javax.sound.sampled.UnsupportedAudioFileException,
                                                    javax.sound.sampled.LineUnavailableException
Create a new AudioDispatcher connected to the default microphone. The default is defined by the Java runtime by calling AudioSystem.getTargetDataLine(format). The microphone must support the following format: 44100 Hz sample rate, 16 bit, mono, signed, big endian.

Parameters:
audioBufferSize - The size of the buffer defines how many samples are processed in one step. Common values are 1024, 2048.
bufferOverlap - The number of samples consecutive buffers overlap. Half of the audioBufferSize is common.
Returns:
An audio dispatcher connected to the default microphone.
Throws:
javax.sound.sampled.UnsupportedAudioFileException
javax.sound.sampled.LineUnavailableException
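The format the default microphone must support (44100 Hz, 16 bit, mono, signed, big endian) can be expressed with the standard javax.sound.sampled API. A sketch; since obtaining the actual line requires a microphone, the AudioSystem call is shown only as a comment.

```java
import javax.sound.sampled.AudioFormat;

public class MicrophoneFormat {

    /** The format fromDefaultMicrophone expects the microphone to support. */
    public static AudioFormat requiredFormat() {
        float sampleRate = 44100f;
        int sampleSizeInBits = 16;
        int channels = 1;        // mono
        boolean signed = true;
        boolean bigEndian = true;
        return new AudioFormat(sampleRate, sampleSizeInBits, channels, signed, bigEndian);
    }

    public static void main(String[] args) {
        AudioFormat format = requiredFormat();
        // On a machine with a microphone, the dispatcher obtains a line via:
        // TargetDataLine line = AudioSystem.getTargetDataLine(format);
        System.out.println(format);
    }
}
```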

fromFloatArray

public static AudioDispatcher fromFloatArray(float[] floatArray,
                                             int sampleRate,
                                             int audioBufferSize,
                                             int bufferOverlap)
                                      throws javax.sound.sampled.UnsupportedAudioFileException
Create a stream from an array of floats and use that to create a new AudioDispatcher.

Parameters:
floatArray - An array of floats, containing audio information.
sampleRate - The sample rate of the audio information contained in the buffer.
audioBufferSize - The size of the buffer defines how many samples are processed in one step. Common values are 1024, 2048.
bufferOverlap - The number of samples consecutive buffers overlap. Half of the audioBufferSize is common.
Returns:
A new AudioDispatcher.
Throws:
javax.sound.sampled.UnsupportedAudioFileException - If the audio format is not supported.
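fromFloatArray makes it possible to exercise a processing chain without a sound file. A sketch that generates one second of a 440 Hz sine wave suitable for passing to fromFloatArray; the frequency, sample rate, and buffer sizes are arbitrary example values.

```java
public class SineSource {

    /** Generate 'seconds' of a sine wave at the given frequency and sample rate. */
    public static float[] sine(double frequency, int sampleRate, double seconds) {
        int length = (int) (sampleRate * seconds);
        float[] samples = new float[length];
        for (int i = 0; i < length; i++) {
            samples[i] = (float) Math.sin(2.0 * Math.PI * frequency * i / sampleRate);
        }
        return samples;
    }

    public static void main(String[] args) {
        float[] tone = sine(440.0, 44100, 1.0);
        // This array could then be handed to the dispatcher, e.g.:
        // AudioDispatcher dispatcher =
        //     AudioDispatcher.fromFloatArray(tone, 44100, 1024, 512);
        System.out.println(tone.length + " samples");
    }
}
```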