Flutter Sound Helpers API

getLastFFmpegCommandOutput()

  • Dart API: getLastFFmpegCommandOutput()

This simple verb is used to get the output of the last FFmpeg command.

Example:

        print( await getLastFFmpegCommandOutput() );

The τ Player API

resumePlayer()

  • Dart API: resumePlayer()

Use this verb to resume the current playback. An exception is thrown if the player is not in the "paused" state.

Example:

        await myPlayer.resumePlayer();

The τ Player API

startPlayerFromTrack()

  • Dart API: startPlayerFromTrack()

Use this verb to play data from a track specification and display controls on the lock screen or an Apple Watch. The Audio Session must have been opened with the parameter withUI.

  • track : a simple structure which describes the sound to play. Please see here the Track structure specification.

  • whenFinished:() : a function specifying what to do when the playback is finished.

  • onPaused:() : this parameter can be :

    • a callback function to call when the user hits the Pause button on the lock screen

    • null : the Pause button will be handled by Flutter Sound internally

  • onSkipForward:() : this parameter can be :

    • a callback function to call when the user hits the Skip Forward button on the lock screen

    • null : the Skip Forward button will be disabled

  • onSkipBackward:() : this parameter can be :

    • a callback function to call when the user hits the Skip Backward button on the lock screen

    • null : the Skip Backward button will be disabled

  • removeUIWhenStopped : a boolean to specify whether the UI on the lock screen must be removed when the sound is finished or when the App calls stopPlayer(). Most of the time this parameter must be true. It is used only for the rare cases where the App wants to control the lock screen between two playbacks. Be aware that if the UI is not removed, the Pause/Resume, Skip Backward and Skip Forward buttons remain active between two playbacks. If you want to disable those buttons, use the API verb nowPlaying(). Remark: currently this parameter is implemented only on iOS.

  • defaultPauseResume : a boolean to specify whether Flutter Sound must pause/resume the playback by itself when the user hits the Pause/Resume button. Set this parameter to FALSE if the App wants to manage the Pause/Resume button itself. If you do not specify this parameter and the onPaused parameter is specified, then Flutter Sound will assume FALSE. If you do not specify this parameter and the onPaused parameter is not specified, then Flutter Sound will assume TRUE. Remark: currently this parameter is implemented only on iOS.

startPlayerFromTrack() returns a Duration Future, which is the duration of the record being played.

Example:

        final fileUri = "https://file-examples.com/wp-content/uploads/2017/11/file_example_MP3_700KB.mp3";
        Track track = Track( codec: Codec.opusOGG, trackPath: fileUri, trackAuthor: '3 Inches of Blood', trackTitle: 'Axes of Evil', albumArtAsset: albumArt );
        Duration d = await myPlayer.startPlayerFromTrack
        (
                    track,
                    whenFinished: ()
                    {
                             print( 'I hope you enjoyed listening to this song' );
                    },
        );

The τ Player API

setAudioFocus()

  • Dart API: setAudioFocus()

focus: parameter possible values are

  • AudioFocus.requestFocus (request focus, but do not do anything special with other Apps)

  • AudioFocus.requestFocusAndStopOthers (your App will have exclusive use of the output audio)

  • AudioFocus.requestFocusAndDuckOthers (if another App, like Spotify, uses the output audio, its volume will be lowered)

  • AudioFocus.requestFocusAndKeepOthers (your App will play sound above other Apps)

  • AudioFocus.requestFocusAndInterruptSpokenAudioAndMixWithOthers

  • AudioFocus.requestFocusTransient (for Android)

  • AudioFocus.requestFocusTransientExclusive (for Android)

  • AudioFocus.abandonFocus (your App will no longer have the audio focus)

Other parameters :

Please look to openAudioSession() to understand the meaning of the other parameters.

Example:

        myPlayer.setAudioFocus(focus: AudioFocus.requestFocusAndDuckOthers);
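As an illustration, here is a minimal sketch (assuming a player already opened with openAudioSession(), and a hypothetical announcementUri) that ducks the other Apps while a short announcement plays, then gives the focus back:

        // Lower the volume of the other Apps while our announcement plays.
        await myPlayer.setAudioFocus(focus: AudioFocus.requestFocusAndDuckOthers);
        await myPlayer.startPlayer
        (
                fromURI: announcementUri, // hypothetical URI of a short sound
                whenFinished: () async
                {
                        // Give the audio focus back to the other Apps.
                        await myPlayer.setAudioFocus(focus: AudioFocus.abandonFocus);
                },
        );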



The τ Player API

foodSink

  • Dart API: foodSink

The sink side of the Food Controller that you use when you want to play live data asynchronously. This StreamSink accepts two kinds of objects :

  • FoodData (the buffers that you want to play)

  • FoodEvent (a callback to be called after a resynchronisation)

Example:

This example shows how to play Live data, without Back Pressure from Flutter Sound :

        await myPlayer.startPlayerFromStream(codec: Codec.pcm16, numChannels: 1, sampleRate: 48000);

        myPlayer.foodSink.add(FoodData(aBuffer));
        myPlayer.foodSink.add(FoodData(anotherBuffer));
        myPlayer.foodSink.add(FoodData(myOtherBuffer));
        myPlayer.foodSink.add(FoodEvent((){myPlayer.stopPlayer();}));

The τ Player API

onProgress

  • Dart API: onProgress

The stream side of the Food Controller : this is a Stream on which Flutter Sound will post the player progression. You may listen to this Stream to have feedback on the current playback.

Each event is a PlaybackDisposition, which has two fields :

  • Duration duration (the total playback duration)

  • Duration position (the current playback position)

Example:

        _playerSubscription = myPlayer.onProgress.listen((e)
        {
                Duration maxDuration = e.duration;
                Duration position = e.position;
                ...
        });
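A small sketch tying onProgress to setSubscriptionDuration() (described further down) : ask for one post every 100 ms, then listen to the Stream to refresh the UI. _updateSlider() is a hypothetical callback of your App.

        myPlayer.setSubscriptionDuration(Duration(milliseconds: 100));
        _playerSubscription = myPlayer.onProgress.listen((e)
        {
                // e.position grows from Duration.zero up to e.duration.
                _updateSlider(position: e.position, total: e.duration); // hypothetical UI callback
        });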

The τ Player API

nowPlaying()

  • Dart API: nowPlaying()

This verb is used to set the lock screen fields without starting a new playback. The fields 'dataBuffer' and 'trackPath' of the Track parameter are not used. Please refer to startPlayerFromTrack() for the meaning of the other parameters. Remark : nowPlaying() is implemented only on iOS.

Example:

        Track track = Track( codec: Codec.opusOGG, trackPath: fileUri, trackAuthor: '3 Inches of Blood', trackTitle: 'Axes of Evil', albumArtAsset: albumArt );
        await myPlayer.nowPlaying(track);

The τ Player API

pausePlayer()

  • Dart API: pausePlayer()

Use this verb to pause the current playback. An exception is thrown if the player is not in the "playing" state.

Example:

        await myPlayer.pausePlayer();

The τ Player API

seekToPlayer()

  • Dart API: seekToPlayer()

Use this verb to seek to a new location. The player must already be playing or paused. If not, an exception is thrown.

Example:

        await myPlayer.seekToPlayer(Duration(milliseconds: milliSecs));

The τ Player API

startPlayer()

  • Dart API: startPlayer()

You can use startPlayer() to play a sound.

  • startPlayer() has three optional parameters, depending on your sound source :

    • fromURI: (if you want to play a file or a remote URI)

    • fromDataBuffer: (if you want to play from a data buffer)

    • fromStream: (if you want to play from a Dart Stream)

You must specify one of the three parameters : fromURI, fromDataBuffer, fromStream.

  • You use the optional parameter codec: for specifying the audio and file format of the file. Please refer to the Codec compatibility Table to know which codecs are currently supported.

  • whenFinished:() : a lambda function specifying what to do when the playback is finished.

  • sampleRate is mandatory if codec == Codec.pcm16. It is not used for the other codecs.

Very often, the codec: parameter is not useful. Flutter Sound will adapt itself depending on the real format of the file provided. But this parameter is necessary when Flutter Sound must do a format conversion (for example to play opusOGG on iOS).

startPlayer() returns a Duration Future, which is the duration of the record being played.

Hint: path_provider can be useful if you want to get access to some directories on your device.

Example:

        Directory tempDir = await getTemporaryDirectory();
        File fin = await File ('${tempDir.path}/flutter_sound-tmp.aac');
        Duration d = await myPlayer.startPlayer(fromURI: fin.path, codec: Codec.aacADTS);

        _playerSubscription = myPlayer.onProgress.listen((e)
        {
                // ...
        });

Example:

        final fileUri = "https://file-examples.com/wp-content/uploads/2017/11/file_example_MP3_700KB.mp3";

        Duration d = await myPlayer.startPlayer
        (
                    fromURI: fileUri,
                    codec: Codec.mp3,
                    whenFinished: ()
                    {
                             print( 'I hope you enjoyed listening to this song' );
                    },
        );
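For completeness, a sketch of the fromDataBuffer: variant, loading a short sound from the App assets (the asset path is hypothetical; rootBundle comes from package:flutter/services.dart and Uint8List from dart:typed_data):

        Uint8List dataBuffer = (await rootBundle.load('assets/sounds/bip.aac')).buffer.asUint8List();
        Duration d = await myPlayer.startPlayer
        (
                fromDataBuffer: dataBuffer,
                codec: Codec.aacADTS,
                whenFinished: () { print('Finished'); },
        );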

The τ Player API

setUIProgressBar()

  • Dart API: setUIProgressBar()

This verb is used if the App wants to control the Progress Bar on the lock screen by itself. By default, this progress bar is handled automatically by Flutter Sound. Remark : setUIProgressBar() is implemented only on iOS.

Example:

        Duration progress = (await getProgress())['progress'];
        Duration duration = (await getProgress())['duration'];
        setUIProgressBar(progress: Duration(milliseconds: progress.inMilliseconds - 500), duration: duration);

    

The τ Player API

setVolume()

  • Dart API: setVolume()

The parameter is a floating point number between 0 and 1. The volume can be changed while the player is running; set it after the player starts.

Example:

        await myPlayer.setVolume(0.1);

The τ Player API

Food

  • Dart API: food

  • Dart API: foodData

  • Dart API: foodEvent

These are the objects that you can add to foodSink. The Food class has two inherited subclasses :

  • FoodData (the buffers that you want to play)

  • FoodEvent (a callback to be called after a resynchronisation)

Example:

This example shows how to play Live data, without Back Pressure from Flutter Sound :

        await myPlayer.startPlayerFromStream(codec: Codec.pcm16, numChannels: 1, sampleRate: 48000);

        myPlayer.foodSink.add(FoodData(aBuffer));
        myPlayer.foodSink.add(FoodData(anotherBuffer));
        myPlayer.foodSink.add(FoodData(myOtherBuffer));
        myPlayer.foodSink.add(FoodEvent(() async { await myPlayer.stopPlayer(); setState((){}); }));

The τ Player API

isDecoderSupported()

  • Dart API: isDecoderSupported()

This verb is useful to know if a particular codec is supported on the current platform. It returns a Future<bool>.

Example:

        if ( await myPlayer.isDecoderSupported(Codec.opusOGG) ) doSomething;
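A sketch of a typical use : pick a codec at runtime and fall back when the platform cannot decode it. uriForCodec() is a hypothetical helper of your App that returns a URI for the given format.

        // Play OGG/Opus when the platform can decode it, otherwise fall back to AAC.
        Codec codec = await myPlayer.isDecoderSupported(Codec.opusOGG) ? Codec.opusOGG : Codec.aacADTS;
        await myPlayer.startPlayer(fromURI: uriForCodec(codec), codec: codec);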

The τ Player API

Creating the Player instance

  • Dart API: constructor

This is the first thing to do if you want to deal with playbacks. The instantiation of a new player does not do much. You are safe if you put this instantiation inside a global or instance variable initialization.

Example:

        FlutterSoundPlayer myPlayer = FlutterSoundPlayer();

The τ Player API

stopPlayer()

  • Dart API: stopPlayer()

Use this verb to stop a playback. This verb never throws any exception. It is safe to call it anywhere, for example when the App is not sure of the current Audio State and wants to recover a clean reset state.

Example:

        await myPlayer.stopPlayer();
        if (_playerSubscription != null)
        {
                _playerSubscription.cancel();
                _playerSubscription = null;
        }

The τ Player API

playerState, isPlaying, isPaused, isStopped, getPlayerState()

  • Dart API: playerState

  • Dart API: isPlaying

  • Dart API: isPaused

  • Dart API: isStopped

  • Dart API: getPlayerState()

These verbs are used when the app wants to get the current Audio State of the player.

playerState is an attribute which can have the following values :

  • isStopped /// Player is stopped

  • isPlaying /// Player is playing

  • isPaused /// Player is paused

The boolean attributes are shortcuts on this state :

  • isPlaying is a boolean attribute which is true when the player is in the "Playing" mode.

  • isPaused is a boolean attribute which is true when the player is in the "Paused" mode.

  • isStopped is a boolean attribute which is true when the player is in the "Stopped" mode.

Flutter Sound shows in the playerState attribute the last known state. When the Audio State of the background OS engine changes, the playerState parameter is not updated at exactly the same time. If you want the exact background OS engine state, you must use PlayerState theState = await myPlayer.getPlayerState(). Currently getPlayerState() is only implemented on iOS.

Example:

        switch(myPlayer.playerState)
        {
                case PlayerState.isPlaying: doSomething; break;
                case PlayerState.isStopped: doSomething; break;
                case PlayerState.isPaused: doSomething; break;
        }
        ...
        if (myPlayer.isStopped) doSomething;
        if (myPlayer.isPlaying) doSomething;
        if (myPlayer.isPaused) doSomething;
        ...
        PlayerState theState = await myPlayer.getPlayerState();
        ...
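A sketch of a play/pause toggle built on those attributes (assuming the player is already open):

        Future<void> togglePlayPause() async
        {
                if (myPlayer.isPlaying)
                        await myPlayer.pausePlayer();
                else if (myPlayer.isPaused)
                        await myPlayer.resumePlayer();
                // In the "Stopped" state there is nothing to toggle :
                // start a new playback with startPlayer() instead.
        }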

    

The τ Player API

openAudioSession() and closeAudioSession()

  • Dart API: openAudioSession()

  • Dart API: closeAudioSession()

A player must be opened before it is used. A player corresponds to an Audio Session. In other words, you must open the Audio Session before using it, and when you have finished with a Player, you must close it, i.e. close your Audio Session. Opening a player takes resources inside the OS. Those resources are freed with the verb closeAudioSession(). It is safe to call this procedure at any time :

  • If the Player is not open, this verb will do nothing.

  • If the Player is currently in play or pause mode, it will be stopped first.

focus: parameter

focus is an optional parameter that can be specified during the opening : the Audio Focus. This parameter can have the following values :

  • AudioFocus.requestFocusAndStopOthers (your App will have exclusive use of the output audio)

  • AudioFocus.requestFocusAndDuckOthers (if another App, like Spotify, uses the output audio, its volume will be lowered)

  • AudioFocus.requestFocusAndKeepOthers (your App will play sound above other Apps)

  • AudioFocus.requestFocusAndInterruptSpokenAudioAndMixWithOthers (for Android)

  • AudioFocus.requestFocusTransient (for Android)

  • AudioFocus.requestFocusTransientExclusive (for Android)

  • AudioFocus.doNotRequestFocus (useful if you want to manage the Audio Focus yourself with the verb setAudioFocus())

The Audio Focus is abandoned when you close your player. If your App must play several sounds, you will probably open your player just once, and close it when you have finished with the last sound. If you close and reopen an Audio Session for each sound, the Audio Focus changes will probably be unpleasant for the ears.

category

category is an optional parameter used only on iOS. This parameter can have the following values :

  • ambient

  • multiRoute

  • playAndRecord

  • playback

  • record

  • soloAmbient

  • audioProcessing

See the iOS documentation to understand the meaning of this parameter.

mode

mode is an optional parameter used only on iOS. This parameter can have the following values :

  • modeDefault

  • modeGameChat

  • modeMeasurement

  • modeMoviePlayback

  • modeSpokenAudio

  • modeVideoChat

  • modeVideoRecording

  • modeVoiceChat

  • modeVoicePrompt

See the iOS documentation to understand the meaning of this parameter.

audioFlags

audioFlags are a set of optional flags (used on iOS) :

  • outputToSpeaker

  • allowHeadset

  • allowEarPiece

  • allowBlueTooth

  • allowAirPlay

  • allowBlueToothA2DP

device

device is the output device (used on Android) :

  • speaker

  • headset

  • earPiece

  • blueTooth

  • blueToothA2DP

  • airPlay

withUI

withUI is a boolean that you set to true if you want to control your App from the lock screen (using startPlayerFromTrack() during your Audio Session).

You MUST ensure that the player has been closed when your widget is detached from the UI. Override your widget's dispose() method to close the player when your widget is disposed. In this way you will reset the player and clean up the device resources, but the player will no longer be usable.

        @override
        void dispose()
        {
                if (myPlayer != null)
                {
                    myPlayer.closeAudioSession();
                    myPlayer = null;
                }
                super.dispose();
        }

You must not open many Audio Sessions without closing them. You will get into trouble if you try something like this :

        while (aCondition)  // *DON'T DO THAT*
        {
                flutterSound = FlutterSoundPlayer().openAudioSession(); // A **new** Flutter Sound instance is created and opened
                flutterSound.startPlayer(bipSound);
        }

openAudioSession() and closeAudioSession() return Futures. You may not use your Player before the end of the initialization, so you will probably await the result of openAudioSession(). This result is the Player itself, so you can collapse instantiation and initialization together with myPlayer = await FlutterSoundPlayer().openAudioSession();

Example:

        myPlayer = await FlutterSoundPlayer().openAudioSession(focus: AudioFocus.requestFocusAndDuckOthers, audioFlags: outputToSpeaker | allowBlueTooth);

        ...
        (do something with myPlayer)
        ...

        await myPlayer.closeAudioSession();
        myPlayer = null;
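Putting the whole life cycle together, a minimal sketch of a State class that opens the player when the widget appears and closes it in dispose() (the MyPlayerWidget class itself is hypothetical):

        class _MyPlayerWidgetState extends State<MyPlayerWidget>
        {
                FlutterSoundPlayer myPlayer;

                @override
                void initState()
                {
                        super.initState();
                        // Open the Audio Session once for the whole life of the widget.
                        FlutterSoundPlayer().openAudioSession().then((player)
                        {
                                setState(() { myPlayer = player; });
                        });
                }

                @override
                void dispose()
                {
                        myPlayer?.closeAudioSession();
                        myPlayer = null;
                        super.dispose();
                }

                @override
                Widget build(BuildContext context) => Container(); // your UI here
        }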


The τ Player API

getProgress()

  • Dart API: getProgress()

This verb is used to get the current progress of a playback. It returns a Map with two Duration entries : 'progress' and 'duration'. Remark : currently it is only implemented on iOS.

Example:

        Duration progress = (await myPlayer.getProgress())['progress'];
        Duration duration = (await myPlayer.getProgress())['duration'];

The τ Player API

startPlayerFromStream()

  • Dart API: startPlayerFromStream()

This functionality needs at least an Android SDK >= 21.

  • The only codec currently supported is Codec.pcm16.

  • The only value currently possible for numChannels is 1.

  • sampleRate is the sample rate of the data you want to play.

Please look to the following notice.

Example: You can look to the three provided examples :

  • This example shows how to play Live data, with Back Pressure from Flutter Sound

  • This example shows how to play Live data, without Back Pressure from Flutter Sound

  • This example shows how to play some real time sound effects.

Example 1:

        await myPlayer.startPlayerFromStream(codec: Codec.pcm16, numChannels: 1, sampleRate: 48000);

        await myPlayer.feedFromStream(aBuffer);
        await myPlayer.feedFromStream(anotherBuffer);
        await myPlayer.feedFromStream(myOtherBuffer);

        await myPlayer.stopPlayer();

Example 2:

        await myPlayer.startPlayerFromStream(codec: Codec.pcm16, numChannels: 1, sampleRate: 48000);

        myPlayer.foodSink.add(FoodData(aBuffer));
        myPlayer.foodSink.add(FoodData(anotherBuffer));
        myPlayer.foodSink.add(FoodData(myOtherBuffer));

        myPlayer.foodSink.add(FoodEvent((){myPlayer.stopPlayer();}));

The τ Player API

feedFromStream()

  • Dart API: feedFromStream()

This is the verb that you use when you want to play live PCM data synchronously. This procedure returns a Future. It is very important that you wait until this Future is completed before trying to play another buffer.

Example:

  • This example shows how to play Live data, with Back Pressure from Flutter Sound

  • This example shows how to play some real time sound effects synchronously.

        await myPlayer.startPlayerFromStream(codec: Codec.pcm16, numChannels: 1, sampleRate: 48000);

        await myPlayer.feedFromStream(aBuffer);
        await myPlayer.feedFromStream(anotherBuffer);
        await myPlayer.feedFromStream(myOtherBuffer);

        await myPlayer.stopPlayer();
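A self-contained sketch : synthesize one second of a 440 Hz sine wave as 16-bit little-endian PCM (using dart:math and dart:typed_data), then play it synchronously with feedFromStream():

        const int sampleRate = 48000;
        final data = Uint8List(sampleRate * 2);          // 1 second, 2 bytes per sample
        final bytes = ByteData.view(data.buffer);
        for (int i = 0; i < sampleRate; ++i)
        {
                int sample = (32767 * sin(2 * pi * 440 * i / sampleRate)).toInt();
                bytes.setInt16(i * 2, sample, Endian.little);
        }

        await myPlayer.startPlayerFromStream(codec: Codec.pcm16, numChannels: 1, sampleRate: sampleRate);
        await myPlayer.feedFromStream(data); // completes when the buffer has been consumed
        await myPlayer.stopPlayer();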

The τ Recorder API

setAudioFocus()

  • Dart API: setAudioFocus()

focus: parameter possible values are

  • AudioFocus.requestFocus (request focus, but do not do anything special with other Apps)

  • AudioFocus.requestFocusAndStopOthers (your app will have exclusive use of the output audio)

  • AudioFocus.requestFocusAndDuckOthers (if another App, like Spotify, uses the output audio, its volume will be lowered)

  • AudioFocus.requestFocusAndKeepOthers (your App will play sound above other Apps)

  • AudioFocus.requestFocusAndInterruptSpokenAudioAndMixWithOthers

  • AudioFocus.requestFocusTransient (for Android)

  • AudioFocus.requestFocusTransientExclusive (for Android)

  • AudioFocus.abandonFocus (your App will no longer have the audio focus)

Other parameters :

Please look to openAudioSession() to understand the meaning of the other parameters.

Example:

        myRecorder.setAudioFocus(focus: AudioFocus.requestFocusAndDuckOthers);

The τ Recorder API

recorderState, isRecording, isPaused, isStopped

  • Dart API: recorderState

  • Dart API: isRecording

  • Dart API: isPaused

  • Dart API: isStopped

These four attributes are used when the app wants to get the current Audio State of the recorder.

recorderState is an attribute which can have the following values :

  • isStopped /// Recorder is stopped

  • isRecording /// Recorder is recording

  • isPaused /// Recorder is paused

The boolean attributes are shortcuts on this state :

  • isRecording is a boolean attribute which is true when the recorder is in the "Recording" mode.

  • isPaused is a boolean attribute which is true when the recorder is in the "Paused" mode.

  • isStopped is a boolean attribute which is true when the recorder is in the "Stopped" mode.

Example:

        switch(myRecorder.recorderState)
        {
                case RecorderState.isRecording: doSomething; break;
                case RecorderState.isStopped: doSomething; break;
                case RecorderState.isPaused: doSomething; break;
        }
        ...
        if (myRecorder.isStopped) doSomething;
        if (myRecorder.isRecording) doSomething;
        if (myRecorder.isPaused) doSomething;
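A sketch of a pause/resume toggle for a recording, guarded by those attributes:

        Future<void> togglePauseResume() async
        {
                if (myRecorder.isRecording)
                        await myRecorder.pauseRecorder();
                else if (myRecorder.isPaused)
                        await myRecorder.resumeRecorder();
                // In the "Stopped" state, start a new recording with startRecorder() instead.
        }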

The τ Recorder API

pauseRecorder()

  • Dart API: pauseRecorder()

On Android this API verb needs at least SDK 24. An exception is thrown if the Recorder is not currently recording.

Example:

        await myRecorder.pauseRecorder();

Flutter Sound Helpers API

Module instantiation

  • Dart API: constructor

You do not need to instantiate the Flutter Sound Helper module. To use this module, you can just use the singleton offered by the module : flutterSoundHelper.

Example:

        Duration t = await flutterSoundHelper.duration(aPathFile);

The τ Recorder API

startRecorder()

  • Dart API: startRecorder()

You use startRecorder() to start recording in an open session. startRecorder() has the destination file path as a parameter. It also has 7 optional parameters to specify :

  • codec: the codec to be used. Please refer to the Codec compatibility Table to know which codecs are currently supported.

  • toFile: a path to the file being recorded

  • toStream: if you want to record to a Dart Stream. Please look to the following notice. This new functionality needs, at least, Android SDK >= 21 (23 is better).

  • sampleRate: the sample rate in Hertz

  • numChannels: the number of channels (1=monophony, 2=stereophony)

  • bitRate: the bit rate in Hertz

  • audioSource : possible values are :

    • defaultSource

    • microphone

    • voiceDownlink (if someone can explain to me what it is, I will be grateful ;-) )

path_provider can be useful if you want to get access to some directories on your device.

Flutter Sound does not take care of the recording permission. It is the App's responsibility to check or request the Recording permission. Permission_handler is probably useful to do that.

Example:

        // Request Microphone permission if needed
        PermissionStatus status = await Permission.microphone.request();
        if (status != PermissionStatus.granted)
                throw RecordingPermissionException("Microphone permission not granted");

        Directory tempDir = await getTemporaryDirectory();
        File outputFile = await File ('${tempDir.path}/flutter_sound-tmp.aac');
        await myRecorder.startRecorder(toFile: outputFile.path, codec: Codec.aacADTS,);
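A sketch of the toStream: variant, assuming the recorded data arrives on the Stream as Food objects (consumePcm() is a hypothetical consumer of your App; StreamController comes from dart:async):

        StreamController<Food> controller = StreamController<Food>();
        controller.stream.listen((food)
        {
                if (food is FoodData)
                        consumePcm(food.data); // hypothetical consumer of the raw PCM bytes
        });

        await myRecorder.startRecorder
        (
                toStream: controller.sink,
                codec: Codec.pcm16,
                numChannels: 1,
                sampleRate: 48000,
        );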

The τ Player API

setSubscriptionDuration()

  • Dart API: setSubscriptionDuration()

This verb is used to change the default interval between two posts on the "Update Progress" stream. (The default interval is 0 (zero), which means "NO post".)

Example:

        myPlayer.setSubscriptionDuration(Duration(milliseconds: 100));

    

The τ Recorder API

onProgress

  • Dart API: onProgress

The attribute onProgress is a Stream on which Flutter Sound will post the recorder progression. You may listen to this Stream to have feedback on the current recording.

Example:

        _recorderSubscription = myRecorder.onProgress.listen((e)
        {
                Duration maxDuration = e.duration;
                double decibels = e.decibels;
                ...
        });

The τ Recorder API

openAudioSession() and closeAudioSession()

  • Dart API: openAudioSession()

  • Dart API: closeAudioSession()

A recorder must be opened before it is used. A recorder corresponds to an Audio Session. In other words, you must open the Audio Session before using it, and when you have finished with a Recorder, you must close it, i.e. close your Audio Session. Opening a recorder takes resources inside the OS. Those resources are freed with the verb closeAudioSession().

You MUST ensure that the recorder has been closed when your widget is detached from the UI. Override your widget's dispose() method to close the recorder when your widget is disposed. In this way you will reset the recorder and clean up the device resources, but the recorder will no longer be usable.

        @override
        void dispose()
        {
                if (myRecorder != null)
                {
                    myRecorder.closeAudioSession();
                    myRecorder = null;
                }
                super.dispose();
        }

You must not open many recorders without releasing them. You will get into trouble if you try something like this :

        while (aCondition)  // *DON'T DO THAT*
        {
                flutterSound = FlutterSoundRecorder().openAudioSession(); // A **new** Flutter Sound instance is created and opened
                ...
        }

openAudioSession() and closeAudioSession() return Futures. You may not use your Recorder before the end of the initialization, so you will probably await the result of openAudioSession(). This result is the Recorder itself, so you can collapse instantiation and initialization together with myRecorder = await FlutterSoundRecorder().openAudioSession();

The four optional parameters are used if you want to control the Audio Focus. Please look to FlutterSoundPlayer openAudioSession() to understand the meaning of those parameters.

Example:

        myRecorder = await FlutterSoundRecorder().openAudioSession();

        ...
        (do something with myRecorder)
        ...

        await myRecorder.closeAudioSession();
        myRecorder = null;

The τ Recorder API

isEncoderSupported()

  • Dart API: isEncoderSupported()

This verb is useful to know if a particular codec is supported on the current platform. It returns a Future<bool>.

Example:

        if ( await myRecorder.isEncoderSupported(Codec.opusOGG) ) doSomething;

Flutter Sound Helpers API

waveToPCMBuffer()

  • Dart API: waveToPCMBuffer()

This verb is useful to convert a Wave buffer to a Raw PCM buffer. Note that this verb is not asynchronous and does not return a Future.

It removes the Wave envelop from the PCM buffer.

Example:

        Uint8List pcmBuffer = flutterSoundHelper.waveToPCMBuffer(inputBuffer: aWaveBuffer);

The τ Recorder API

resumeRecorder()

  • Dart API: resumeRecorder()

On Android this API verb needs at least SDK 24. An exception is thrown if the Recorder is not currently paused.

Example:

        await myRecorder.resumeRecorder();

The τ Recorder API

stopRecorder()

  • Dart API: stopRecorder()

Use this verb to stop a recording. This verb never throws any exception. It is safe to call it anywhere, for example when the App is not sure of the current Audio State and wants to recover a clean reset state.

Example:

        await myRecorder.stopRecorder();
        if (_recorderSubscription != null)
        {
                _recorderSubscription.cancel();
                _recorderSubscription = null;
        }

Flutter Sound Helpers API

waveToPCM()

  • Dart API: waveToPCM()

This verb is useful to convert a Wave file to a Raw PCM file.

It removes the Wave envelop from the PCM file.

Example:

        String inputFile = '$myInputPath/bar.wav';
        var tempDir = await getTemporaryDirectory();
        String outputFile = '${tempDir.path}/$foo.pcm';
        await flutterSoundHelper.waveToPCM(inputFile: inputFile, outputFile: outputFile);

Flutter Sound Helpers API

isFFmpegAvailable()

  • Dart API: isFFmpegAvailable()

This verb is used to know at runtime if FFmpeg is linked with the App.

Example:

        if ( await flutterSoundHelper.isFFmpegAvailable() )
        {
                Duration d = await flutterSoundHelper.duration("$myFilePath/bar.wav");
        }

Flutter Sound Helpers API

getLastFFmpegReturnCode()

  • Dart API: getLastFFmpegReturnCode()

This simple verb is used to get the result of the last FFmpeg command.

Example:

        int result = await getLastFFmpegReturnCode();

The τ Recorder API

setSubscriptionDuration()

  • Dart API: setSubscriptionDuration()

This verb is used to change the default interval between two posts on the "Update Progress" stream. (The default interval is 0 (zero), which means "NO post".)

Example:

        // 0 is default
        myRecorder.setSubscriptionDuration(Duration(milliseconds: 10));

    

Flutter Sound Helpers API

executeFFmpegWithArguments()

  • Dart API: executeFFmpegWithArguments()

This verb is a wrapper for the great FFmpeg application. The command "man ffmpeg" (if you have installed ffmpeg on your computer) will give you much information. If you do not have ffmpeg on your computer, you will easily find much documentation on this great program on the Internet.

Example:

        int rc = await flutterSoundHelper.executeFFmpegWithArguments
        ([
                '-loglevel',
                'error',
                '-y',
                '-i',
                infile,
                '-c:a',
                'copy',
                outfile,
        ]); // remux OGG to CAF
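A sketch combining this verb with the two FFmpeg status verbs described elsewhere in this API, to log what happened when a command fails (infile and outfile are assumed to be existing paths):

        int rc = await flutterSoundHelper.executeFFmpegWithArguments
        (['-loglevel', 'error', '-y', '-i', infile, '-c:a', 'copy', outfile]);
        if (rc != 0)
        {
                print(await getLastFFmpegReturnCode());
                print(await getLastFFmpegCommandOutput()); // the FFmpeg log of the failed command
        }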

Flutter Sound Helpers API

ffMpegGetMediaInformation()

  • Dart API: ffMpegGetMediaInformation()

This verb is used to get various information on a file.

The information obtained with FFmpegGetMediaInformation() is documented here.

Example:

        Map<dynamic, dynamic> info = await flutterSoundHelper.FFmpegGetMediaInformation( uri );

SoundPlayerUI

The τ UI Widgets

How to use

First import the module : import 'package:flutter_sound/flutter_sound.dart';

The SoundPlayerUI provides a Playback widget styled after the HTML 5 audio player.

The player displays a loading indicator and allows the user to pause/resume/skip via the progress bar.

You can also pause/resume the player via an API call to SoundPlayerUI's state using a GlobalKey.

The SoundPlayerUI widget allows you to play back audio from multiple sources:

  • File

  • Asset

  • URL

  • Buffer

MediaFormat

When using the SoundPlayerUI you MUST pass a Track that has been initialised with a supported MediaFormat.

The Widget needs to obtain the duration of the audio that it is playing, and that can only be done if we know the MediaFormat of the Track.

If you pass a Track that wasn't constructed with a MediaFormat, then a MediaFormatException will be thrown.

The MediaFormat must also be natively supported by the OS. See mediaformat.md for additional details on checking for a supported format.

Example:

Sounds uses Track as the primary method of handing around audio data.

        Track track;

        /// global key so we can pause/resume the player via the api.
        var playerStateKey = GlobalKey<SoundPlayerUIState>();

        void initState()
        {
           track = Track.fromAsset('assets/rock.mp3', mediaFormat: Mp3MediaFormat());
        }

        Widget build(BuildContext build)
        {
            var player = SoundPlayerUI.fromTrack(track, key: playerStateKey);
            return
                Column(children: [
                    player,
                    RaisedButton(child: Text("Pause"), onPressed: () => playerStateKey.currentState.pause()),
                    RaisedButton(child: Text("Resume"), onPressed: () => playerStateKey.currentState.resume())
                ]);
        }

You can also dynamically load a Track when the user clicks the 'Play' button on the SoundPlayerUI widget. This allows you to delay the decision on what Track is going to be played until the user clicks the 'Play' button.

        Track track;

        void initState()
        {
           track = Track.fromAsset('assets/rock.mp3', mediaFormat: Mp3MediaFormat());
        }

        Widget build(BuildContext build)
        {
            return SoundPlayerUI.fromLoader((context) => loadTrack());
        }

        Future<Track> loadTrack() async
        {
            Track track;
            track = Track.fromAsset('assets/rock.mp3', mediaFormat: Mp3MediaFormat());

            track.title = "Asset playback.";
            track.artist = "By sounds";
            return track;
        }

Flutter Sound Helpers API

duration()

  • Dart API: duration()

This verb is used to get an estimation of the duration of a sound file. Be aware that it is just an estimation, based on the codec used and the sample rate.

Note : this verb uses FFmpeg and is not available in the LITE flavor of Flutter Sound.

Example:

        Duration d = await flutterSoundHelper.duration("$myFilePath/bar.wav");

    

Flutter Sound Helpers API

pcmToWave()

  • Dart API: pcmToWave()

This verb is useful to convert a Raw PCM file to a Wave file.

It adds a Wave envelop to the PCM file, so that the file can be played back with startPlayer().

Note: the parameters numChannels and sampleRate are mandatory, and must match the actual PCM data. See here a discussion about the Raw PCM and WAVE file formats.

Example:

        String inputFile = '$myInputPath/bar.pcm';
        var tempDir = await getTemporaryDirectory();
        String outputFile = '${tempDir.path}/$foo.wav';
        await flutterSoundHelper.pcmToWave(inputFile: inputFile, outputFile: outputFile, numChannels: 1, sampleRate: 8000);
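A round-trip sketch, assuming fooPcmFile is an existing Raw PCM file recorded in monophony at 16000 Hz: wrap it in a Wave envelop, then play it back with startPlayer():

        var tempDir = await getTemporaryDirectory();
        String wavFile = '${tempDir.path}/foo.wav';
        await flutterSoundHelper.pcmToWave(inputFile: fooPcmFile, outputFile: wavFile, numChannels: 1, sampleRate: 16000);
        await myPlayer.startPlayer(fromURI: wavFile, codec: Codec.pcm16WAV);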


Flutter Sound Helpers API

pcmToWaveBuffer()

  • Dart API: pcmToWaveBuffer()

This verb is useful to convert a Raw PCM buffer to a Wave buffer.

It adds a Wave envelop in front of the PCM buffer, so that the buffer can be played back with startPlayerFromBuffer().

Note: the parameters numChannels and sampleRate are mandatory, and must match the actual PCM data. See here a discussion about the Raw PCM and WAVE file formats.

Example:

        Uint8List myWavBuffer = await flutterSoundHelper.pcmToWaveBuffer(inputBuffer: myPCMBuffer, numChannels: 1, sampleRate: 8000);

Flutter Sound Helpers API

convertFile()

  • Dart API: convertFile()

This verb is useful to convert a sound file to a new format.

  • infile is the file path of the file you want to convert

  • codecin is the actual file format

  • outfile is the path of the file you want to create

  • codecout is the new file format

Be careful : outfile and codecout must be compatible. The output file extension must be a correct file extension for the new format.

Note : this verb uses FFmpeg and is not available in the LITE flavor of Flutter Sound.

Example:

        String inputFile = '$myInputPath/bar.wav';
        var tempDir = await getTemporaryDirectory();
        String outputFile = '${tempDir.path}/$foo.mp3';
        await flutterSoundHelper.convertFile(inputFile, Codec.pcm16WAV, outputFile, Codec.mp3);



    


The Main modules

The τ API

τ is composed of 4 modules :

  • FlutterSoundPlayer, which deals with everything about playbacks

  • FlutterSoundRecorder, which deals with everything about recording

  • FlutterSoundHelper, which offers some convenient tools

  • FlutterSoundUI, which offers some Widgets ready to be used out of the box

To use Flutter Sound you just do :

        import 'package:flutter_sound/flutter_sound.dart';

This will import all the necessary Dart interfaces.

Playback

1. Instantiate one or more players. A good place to do that is in your init() function. It is also possible to instantiate the players "on the fly", when needed.

        FlutterSoundPlayer myPlayer = FlutterSoundPlayer();

2. Open it. You cannot do anything on a closed Player. An Audio Session is then created.

        myPlayer.openAudioSession().then( (){ ...} );

3. Use the various verbs implemented by the players :

  • startPlayer()

  • startPlayerFromStream()

  • startPlayerFromBuffer()

  • setVolume()

  • FlutterSoundPlayer.stopPlayer()

  • ...

4. Close your players. It is important to close every open player to free the resources taken by the Audio Session. A good place to do that is in the dispose() procedure.

        myPlayer.closeAudioSession();

Recording

1. Instantiate your recorder. A good place to do that is in your init() function.

        FlutterSoundRecorder myRecorder = FlutterSoundRecorder();

2. Open it. You cannot do anything on a closed Recorder. An Audio Session is then created.

        myRecorder.openAudioSession().then( (){ ...} );

3. Use the various verbs implemented by the recorders :

  • startRecorder()

  • pauseRecorder()

  • resumeRecorder()

  • stopRecorder()

  • ...

4. Close your recorder. It is important to close it to free the resources taken by the Audio Session. A good place to do that is in the dispose() procedure.

        myRecorder.closeAudioSession();
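A minimal sketch putting the four playback steps together (the URI is hypothetical):

        FlutterSoundPlayer myPlayer = FlutterSoundPlayer();          // 1. instantiate

        Future<void> playOnce(String uri) async
        {
                await myPlayer.openAudioSession();                   // 2. open
                await myPlayer.startPlayer                           // 3. use the verbs
                (
                        fromURI: uri,
                        whenFinished: () async
                        {
                                await myPlayer.closeAudioSession();  // 4. close when done
                        },
                );
        }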

Flutter Sound Helpers API

The τ utilities API

Module instantiation

Dart definition (prototype) :

        FlutterSoundHelper flutterSoundHelper = FlutterSoundHelper(); // Singleton

You do not need to instantiate the Flutter Sound Helper module. To use this module, you can just use the singleton offered by the module : flutterSoundHelper.

Example:

        Duration t = await flutterSoundHelper.duration(aPathFile);

convertFile()

Dart definition (prototype) :

        Future<bool> convertFile
        (
                String infile,
                Codec codecin,
                String outfile,
                Codec codecout
        ) async

This verb is useful to convert a sound file to a new format.

  • infile is the file path of the file you want to convert

  • codecin is the actual file format

  • outfile is the path of the file you want to create

  • codecout is the new file format

Be careful : outfile and codecout must be compatible. The output file extension must be a correct file extension for the new format.

Note : this verb uses FFmpeg and is not available in the LITE flavor of Flutter Sound.

Example:

        String inputFile = '$myInputPath/bar.wav';
        var tempDir = await getTemporaryDirectory();
        String outputFile = '${tempDir.path}/$foo.mp3';
        await flutterSoundHelper.convertFile(inputFile, Codec.pcm16WAV, outputFile, Codec.mp3);

pcmToWave()

Dart definition (prototype) :

        Future<void> pcmToWave
        (
              {
                  String inputFile,
                  String outputFile,
                  int numChannels,
                  int sampleRate,
              }
        ) async

This verb is useful to convert a Raw PCM file to a Wave file.

It adds a Wave envelop to the PCM file, so that the file can be played back with startPlayer().

Note: the parameters numChannels and sampleRate are mandatory, and must match the actual PCM data. See here a discussion about the Raw PCM and WAVE file formats.

Example:

        String inputFile = '$myInputPath/bar.pcm';
        var tempDir = await getTemporaryDirectory();
        String outputFile = '${tempDir.path}/$foo.wav';
        await flutterSoundHelper.pcmToWave(inputFile: inputFile, outputFile: outputFile, numChannels: 1, sampleRate: 8000);

pcmToWaveBuffer()

Dart definition (prototype) :

        Future<Uint8List> pcmToWaveBuffer
        (
              {
                Uint8List inputBuffer,
                int numChannels,
                int sampleRate,
              }
        ) async

This verb is useful to convert a Raw PCM buffer to a Wave buffer.

It adds a Wave envelop in front of the PCM buffer, so that the buffer can be played back with startPlayerFromBuffer().

Note: the parameters numChannels and sampleRate are mandatory, and must match the actual PCM data. See here a discussion about the Raw PCM and WAVE file formats.

Example:

        Uint8List myWavBuffer = await flutterSoundHelper.pcmToWaveBuffer(inputBuffer: myPCMBuffer, numChannels: 1, sampleRate: 8000);

waveToPCM()

Dart definition (prototype) :

        Future<void> waveToPCM
        (
              {
                  String inputFile,
                  String outputFile,
               }
        ) async

This verb is useful to convert a Wave file to a Raw PCM file.

It removes the Wave envelop from the PCM file.

Example:

        String inputFile = '$myInputPath/bar.wav';
        var tempDir = await getTemporaryDirectory();
        String outputFile = '${tempDir.path}/$foo.pcm';
        await flutterSoundHelper.waveToPCM(inputFile: inputFile, outputFile: outputFile);

waveToPCMBuffer()

Dart definition (prototype) :

        Uint8List waveToPCMBuffer (Uint8List inputBuffer)

This verb is useful to convert a Wave buffer to a Raw PCM buffer. Note that this verb is not asynchronous and does not return a Future.

It removes the Wave envelop from the PCM buffer.

Example:

        Uint8List pcmBuffer = flutterSoundHelper.waveToPCMBuffer(inputBuffer: aWaveBuffer);

duration()

Dart definition (prototype) :

        Future<Duration> duration(String uri) async

This verb is used to get an estimation of the duration of a sound file. Be aware that it is just an estimation, based on the codec used and the sample rate.

Note : this verb uses FFmpeg and is not available in the LITE flavor of Flutter Sound.

Example:

        Duration d = await flutterSoundHelper.duration("$myFilePath/bar.wav");

isFFmpegAvailable()

Dart definition (prototype) :

        Future<bool> isFFmpegAvailable() async

This verb is used to know at runtime if FFmpeg is linked with the App.

Example:

        if ( await flutterSoundHelper.isFFmpegAvailable() )
        {
                Duration d = await flutterSoundHelper.duration("$myFilePath/bar.wav");
        }

executeFFmpegWithArguments()

Dart definition (prototype) :

        Future<int> executeFFmpegWithArguments(List<String> arguments)

This verb is a wrapper for the great FFmpeg application. The command "man ffmpeg" (if you have installed ffmpeg on your computer) will give you much information. If you do not have ffmpeg on your computer, you will easily find much documentation on this great program on the Internet.

Example:

        int rc = await flutterSoundHelper.executeFFmpegWithArguments
        ([
                '-loglevel',
                'error',
                '-y',
                '-i',
                infile,
                '-c:a',
                'copy',
                outfile,
        ]); // remux OGG to CAF

getLastFFmpegReturnCode()

Dart definition (prototype) :

        Future<int> getLastFFmpegReturnCode() async

This simple verb is used to get the result of the last FFmpeg command.

Example:

        int result = await getLastFFmpegReturnCode();

getLastFFmpegCommandOutput()

Dart definition (prototype) :

        Future<String> getLastFFmpegCommandOutput() async

This simple verb is used to get the output of the last FFmpeg command.

Example:

        print( await getLastFFmpegCommandOutput() );

FFmpegGetMediaInformation

Dart definition (prototype) :

        Future<Map<dynamic, dynamic>> FFmpegGetMediaInformation(String uri) async

This verb is used to get various information on a file.

The information obtained with FFmpegGetMediaInformation() is documented here.

Example:

        Map<dynamic, dynamic> info = await flutterSoundHelper.FFmpegGetMediaInformation( uri );

