
guides_record_stream

Recording PCM-16 to a Dart Stream.

Please remember that Flutter Sound does not currently support floating-point PCM data, nor recording with more than one audio channel. In Flutter Sound, Raw PCM is always PCM-Linear 16, monophonic.

To record to a live PCM stream, call the verb startRecorder() with the toStream: parameter set to your stream sink, instead of the toFile: parameter. You then listen to the corresponding stream to process the recorded data as it arrives.

Notes:

  • This functionality needs at least Android SDK 21.

  • It works better with Android minSdk >= 23, because earlier SDKs could not perform non-blocking writes.

Example

You can look at the simple example provided with Flutter Sound.

// Forward each recorded PCM-16 buffer from the stream to an output file.
IOSink outputFile = await createFile();
StreamController<Food> recordingDataController = StreamController<Food>();
_mRecordingDataSubscription =
    recordingDataController.stream.listen((Food food) {
  if (food is FoodData) {
    outputFile.add(food.data); // raw PCM-16 bytes
  }
});
await _mRecorder.startRecorder(
  toStream: recordingDataController.sink,
  codec: Codec.pcm16,
  numChannels: 1,
  sampleRate: 48000,
);
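The createFile() helper used above is not part of Flutter Sound. A minimal sketch, assuming dart:io and the path_provider package (the file name is hypothetical):

// Hypothetical helper, not a Flutter Sound API: opens an IOSink on a
// temporary file so the recorded PCM buffers can be appended to it.
Future<IOSink> createFile() async {
  Directory tempDir = await getTemporaryDirectory(); // from path_provider
  File outputFile = File('${tempDir.path}/mySound.pcm');
  if (outputFile.existsSync()) {
    await outputFile.delete();
  }
  return outputFile.openWrite();
}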

guides_play_from_stream

Playing PCM-16 from a Dart Stream.

Please remember that Flutter Sound does not currently support floating-point PCM data, nor more than one audio channel.

To play a live stream, start playback with the verb startPlayerFromStream() instead of the regular startPlayer() verb:

await myPlayer.startPlayerFromStream
(
    codec: Codec.pcm16,  // Currently this is the only codec possible
    numChannels: 1,      // Currently this is the only possible value: you cannot have several channels
    sampleRate: 48100,   // This parameter is important if you want to specify your own sample rate
);

The first thing you have to do if you want to play live audio is to answer this question: do I need back pressure from Flutter Sound, or not?

Without back pressure

The App just calls myPlayer.foodSink.add(FoodData(aBuffer)) each time it wants to play some data. There is no need to await, and no need to verify that the previous buffers have finished playing. All the buffers added to foodSink are buffered and played sequentially. The App continues to work without knowing when the buffers are actually played.

await myPlayer.startPlayerFromStream(codec: Codec.pcm16, numChannels: 1, sampleRate: 48000);

myPlayer.foodSink.add(FoodData(aBuffer));
myPlayer.foodSink.add(FoodData(anotherBuffer));
myPlayer.foodSink.add(FoodData(myOtherBuffer));

myPlayer.foodSink.add(FoodEvent(() { myPlayer.stopPlayer(); }));

This means two things:

  • If the App is very fast at adding buffers to foodSink, it can consume a lot of memory for the waiting buffers.

  • When the App has finished feeding the sink, it cannot just call myPlayer.stopPlayer(), because there may still be many buffers not yet played. If it calls stopPlayer(), all the waiting buffers will be flushed, which is probably not what it wants.

But there is a mechanism if the App wants to resynchronize with the output stream: it adds a FoodEvent to the sink, and the event's callback is run once everything queued before it has been played.

myPlayer.foodSink.add
( FoodEvent
  (
     () async
     {
          await myPlayer.stopPlayer();
          setState((){});
     }
  )
);

Example:

You can look at this simple example provided with Flutter Sound.

With back pressure

If the App wants to keep synchronization with what is played, it uses the verb feedFromStream() to play data. It is very important not to call feedFromStream() again before the previous Future has completed. When each Future completes, the App can be sure that the provided data has been played, or at least placed in low-level internal buffers, and it knows that it is safe to feed the next buffer.

await myPlayer.startPlayerFromStream(codec: Codec.pcm16, numChannels: 1, sampleRate: 48000);

await myPlayer.feedFromStream(aBuffer);
await myPlayer.feedFromStream(anotherBuffer);
await myPlayer.feedFromStream(myOtherBuffer);

await myPlayer.stopPlayer();

You will probably await, or use then(), for each call to feedFromStream().

Example:

You can look at this example and this example.

Notes:

  • This functionality needs at least Android SDK 21.

  • It works better with Android minSdk >= 23, because earlier SDKs could not perform non-blocking writes.

Examples

You can look at the provided examples:

  • This example shows how to play live data, with back pressure from Flutter Sound.

  • This example shows how to play live data, without back pressure from Flutter Sound.

  • This example shows how to play some real-time sound effects.

  • This example plays, live, what is being recorded from the microphone (a minimal sketch follows this list).
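The microphone loopback mentioned above essentially wires the recorder's toStream: into the player's foodSink. A minimal sketch, assuming _mRecorder and _mPlayer are already-opened recorder and player sessions:

// Sketch: microphone loopback without back pressure.
// Assumes _mRecorder / _mPlayer are opened FlutterSoundRecorder and
// FlutterSoundPlayer sessions, as in the Getting Started section below.
StreamController<Food> loopbackController = StreamController<Food>();

// Every Food packet produced by the recorder is queued for playback.
loopbackController.stream.listen((Food food) {
  _mPlayer.foodSink.add(food);
});

await _mPlayer.startPlayerFromStream(
    codec: Codec.pcm16, numChannels: 1, sampleRate: 48000);
await _mRecorder.startRecorder(
  toStream: loopbackController.sink,
  codec: Codec.pcm16,
  numChannels: 1,
  sampleRate: 48000,
);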

Getting Started

Playback

The complete running example is there.

1. FlutterSoundPlayer instantiation

To play back something, you must instantiate a player. Most of the time you will need just one player, and you can place this instantiation among the variable initialisations of your class:

import 'package:flauto/flutter_sound.dart';
...
FlutterSoundPlayer _myPlayer = FlutterSoundPlayer();

2. Open and close the audio session

Before calling startPlayer() you must open the audio session, and when you have finished with it you must close the session. Good places to put those two verbs are the procedures initState() and dispose():

@override
void initState() {
  super.initState();
  // Be careful: openAudioSession() returns a Future.
  // Do not access your FlutterSoundPlayer or FlutterSoundRecorder
  // before the completion of the Future.
  _myPlayer.openAudioSession().then((value) {
    setState(() {
      _mPlayerIsInited = true;
    });
  });
}

@override
void dispose() {
  // Be careful: you must close the audio session when you have finished with it.
  _myPlayer.closeAudioSession();
  _myPlayer = null;
  super.dispose();
}

3. Play your sound

To play a sound, call startPlayer(). To stop a sound, call stopPlayer().

Future<void> play() async {
  await _myPlayer.startPlayer(
    fromURI: _exampleAudioFilePathMP3,
    codec: Codec.mp3,
    whenFinished: () { setState(() {}); },
  );
  setState(() {});
}

Future<void> stopPlayer() async {
  if (_myPlayer != null) {
    await _myPlayer.stopPlayer();
  }
}

Recording

The complete running example is there.

1. FlutterSoundRecorder instantiation

To record something, you must instantiate a recorder. Most of the time you will need just one recorder, and you can place this instantiation among the variable initialisations of your class:

FlutterSoundRecorder _myRecorder = FlutterSoundRecorder();

2. Open and close the audio session

Before calling startRecorder() you must open the audio session, and when you have finished with it you must close the session. Good places to put those two verbs are the procedures initState() and dispose():

@override
void initState() {
  super.initState();
  // Be careful: openAudioSession() returns a Future.
  // Do not access your FlutterSoundPlayer or FlutterSoundRecorder
  // before the completion of the Future.
  _myRecorder.openAudioSession().then((value) {
    setState(() {
      _mRecorderIsInited = true;
    });
  });
}

@override
void dispose() {
  // Be careful: you must close the audio session when you have finished with it.
  _myRecorder.closeAudioSession();
  _myRecorder = null;
  super.dispose();
}

3. Record something

To record something, call startRecorder(). To stop the recorder, call stopRecorder().

Future<void> record() async {
  await _myRecorder.startRecorder(
    toFile: _mPath, // the path of the file to record to
    codec: Codec.aacADTS,
  );
}

Future<void> stopRecorder() async {
  await _myRecorder.stopRecorder();
}

Recording or playing Raw PCM INT-Linear 16 files

Recording PCM.

Please remember that Flutter Sound does not currently support floating-point PCM data, nor recording with more than one audio channel.

To record a Raw PCM16 file, use the regular startRecorder() API verb. To play a Raw PCM16 file, you can either add a Wave header in front of the file with the pcm16ToWave() verb (a sketch follows the example below), or call the regular startPlayer() API verb. If you do the latter, you must provide the sampleRate and numChannels parameters in the call. You can look at the simple example provided with Flutter Sound. [TODO]

Example

Directory tempDir = await getTemporaryDirectory();
String outputFile = '${tempDir.path}/myFile.pcm';

await myRecorder.startRecorder
(
    codec: Codec.pcm16,
    toFile: outputFile,
    sampleRate: 16000,
    numChannels: 1,
);

...
await myRecorder.stopRecorder();
...

await myPlayer.startPlayer
(
    fromURI: outputFile,
    codec: Codec.pcm16,
    numChannels: 1,
    sampleRate: 16000, // Used only with codec == Codec.pcm16
    whenFinished: () { /* Do something */ },
);
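For the Wave-header option, a sketch of the conversion path could look like this. The pcmToWave() call follows the FlutterSoundHelper API, but check the exact helper name and signature in your flutter_sound version; the paths reuse outputFile and tempDir from the example above.

// Sketch: wrap the Raw PCM file in a Wave envelope, then play it with the
// regular startPlayer(). Helper name/signature assumed from FlutterSoundHelper.
await flutterSoundHelper.pcmToWave(
  inputFile: outputFile,
  outputFile: '${tempDir.path}/myFile.wav',
  sampleRate: 16000,
  numChannels: 1,
);
await myPlayer.startPlayer(
  fromURI: '${tempDir.path}/myFile.wav',
  codec: Codec.pcm16WAV,
  whenFinished: () { /* Do something */ },
);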
Notification/Lock Screen

Controls on the lock screen.

A number of platforms (Android/iOS) support the concept of a 'Shade' or 'notification' area, with the ability to control audio playback from the Shade.

When using a Shade, a platform may also allow the user to control media playback from the platform's lock screen.

Using a Shade does not stop you from also displaying an in-app widget to control audio. The SoundPlayerUI widget works in conjunction with the Shade.

The Shade may also display information contained in the Track, such as album, artist or artwork.

A Shade often allows the user to pause and resume audio, as well as skip forward to the next track and skip backward to the prior track.

τ allows you to enable the Shade controls when you start playback. It also allows you (where the platform supports it) to control which of the media buttons are displayed (pause, resume, skip forward, skip backward).

To start audio playback using the Shade, use:

SoundPlayer.withShadeUI(track);

The withShadeUI constructor allows you to control which of the Shade buttons are displayed. The platform MAY choose to ignore any of the button choices you make.

Skipping Tracks

If you allow the Shade to display the Skip Forward and Skip Back buttons, you must provide callbacks for onSkipForward and onSkipBackward. When the user taps the respective buttons, you will receive the relevant callback.

var player = SoundPlayer.withShadeUI(track,
    canSkipBackward: true, canSkipForward: true);
player.onSkipBackward = () => player.startPlayer(getPreviousTrack());
player.onSkipForward = () => player.startPlayer(getNextTrack());
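The Shade reads its display information from the Track. A sketch of providing that metadata follows; the exact Track field names are assumptions, so check the Track API documentation:

// Sketch: a Track carrying the metadata (title, artist, artwork) that the
// Shade may display. Field names and the artwork URL are assumptions.
var track = Track.fromFile('sample.aac');
track.title = 'My Title';
track.artist = 'My Artist';
track.albumArtUrl = 'https://example.com/art.png'; // hypothetical URL

var player = SoundPlayer.withShadeUI(track);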
guides-pcm-wave

Raw PCM and Wave files.

Raw PCM is not an audio format. Raw PCM files store the raw audio data without any envelope. A simple way to play a Raw PCM file is to add a Wave header in front of the data before playing it; the helper verb pcmToWave() is convenient for that. You can also call the startPlayer() verb directly. If you do that, do not forget to provide the sampleRate and numChannels parameters.

A Wave file is just PCM data in a specific file format (a sketch of the header layout follows the limitations list below).

The Wave audio file format has a terrible drawback: it cannot be streamed. A Wave file is not valid until it is closed; while it is being written, it is considered corrupted, because the Wave header has not yet been written.

Note the following limitations in the current Flutter Sound version:

  • Streams are PCM-Integer Linear 16 with just one channel. Flutter Sound does not currently handle Raw PCM with floating-point samples, nor with more than one audio channel.

  • FlutterSoundHelper duration() does not work with Raw PCM files.

  • startPlayer() does not return the record duration.

  • The withUI parameter of openAudioSession() is currently incompatible with Raw PCM files.
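To make "a Wave file is just PCM data in a specific file format" concrete, here is the standard 44-byte RIFF/WAVE header that a converter like pcmToWave() conceptually prepends. This is plain Dart illustrating the format, not a Flutter Sound API:

import 'dart:typed_data';

// Build the standard 44-byte RIFF/WAVE header for PCM-Linear 16 mono data.
Uint8List waveHeader(int pcmDataLength,
    {int sampleRate = 48000, int numChannels = 1}) {
  const bitsPerSample = 16;
  int byteRate = sampleRate * numChannels * bitsPerSample ~/ 8;
  int blockAlign = numChannels * bitsPerSample ~/ 8;
  var header = BytesBuilder();
  void writeString(String s) => header.add(s.codeUnits);
  void writeUint32(int v) => header
      .add(Uint8List(4)..buffer.asByteData().setUint32(0, v, Endian.little));
  void writeUint16(int v) => header
      .add(Uint8List(2)..buffer.asByteData().setUint16(0, v, Endian.little));

  writeString('RIFF');
  writeUint32(36 + pcmDataLength); // total file size minus 8 bytes
  writeString('WAVE');
  writeString('fmt ');
  writeUint32(16);                 // fmt chunk size
  writeUint16(1);                  // audio format 1 = integer PCM
  writeUint16(numChannels);
  writeUint32(sampleRate);
  writeUint32(byteRate);
  writeUint16(blockAlign);
  writeUint16(bitsPerSample);
  writeString('data');
  writeUint32(pcmDataLength);      // the raw PCM bytes follow this header
  return header.toBytes();
}

Because the header stores the total data length up front, it cannot be written before the recording is finished, which is exactly why Wave cannot be streamed.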
Supported Codecs

Supported codecs.

On mobile OS

Currently, the following codecs are supported by flutter_sound:


| Codec | iOS encoder | iOS decoder | Android encoder | Android decoder |
| --- | --- | --- | --- | --- |
| AAC ADTS | ✅ | ✅ | ✅ (1) | ✅ |
| Opus OGG | ✅ (*) | ✅ (*) | ❌ | ✅ (1) |
| Opus CAF | ✅ | ✅ | ❌ | ✅ (*) (1) |
| MP3 | ❌ | ✅ | ❌ | ✅ |
| Vorbis OGG | ❌ | ❌ | ❌ | ✅ |
| PCM16 | ✅ | ✅ | ✅ (1) | ✅ |
| PCM Wave | ✅ | ✅ | ✅ (1) | ✅ |
| PCM AIFF | ❌ | ✅ | ❌ | ✅ (*) |
| PCM CAF | ✅ | ✅ | ❌ | ✅ (*) |
| FLAC | ✅ | ✅ | ❌ | ✅ |
| AAC MP4 | ✅ | ✅ | ✅ (1) | ✅ |
| AMR NB | ❌ | ❌ | ✅ (1) | ✅ |
| AMR WB | ❌ | ❌ | ✅ (1) | ✅ |
| PCM8 | ❌ | ❌ | ❌ | ❌ |
| PCM F32 | ❌ | ❌ | ❌ | ❌ |
| PCM WEBM | ❌ | ❌ | ❌ | ❌ |
| Opus WEBM | ❌ | ❌ | ✅ | ✅ |
| Vorbis WEBM | ❌ | ❌ | ❌ | ✅ |

This table will be upgraded as more codecs are added.

  • ✅ (*): the codec is supported by Flutter Sound, but through a file-format conversion. This has several drawbacks:

    • It needs FFmpeg, which is not included in the LITE flavor of Flutter Sound.

    • It can add some delay before playing back the file, or after stopping the recording. This delay can be substantial for very large recordings.

  • ✅ (1): needs minSdk >= 23.

On Web browsers

| Codec | Chrome encoder | Chrome decoder | Firefox encoder | Firefox decoder | Webkit encoder (Safari) | Webkit decoder (Safari) |
| --- | --- | --- | --- | --- | --- | --- |
| AAC ADTS | ❌ | ✅ | ❌ | ✅ | ❌ | ✅ |
| Opus OGG | ❌ | ✅ | ✅ | ✅ | ❌ | ❌ |
| Opus CAF | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ |
| MP3 | ❌ | ✅ | ❌ | ✅ | ❌ | ✅ |
| Vorbis OGG | ❌ | ✅ | ❌ | ✅ | ❌ | ❌ |
| PCM16 | ❌ | ✅ | ❌ | ✅ | ❌ | ❌ (must be verified) |
| PCM Wave | ❌ | ✅ | ❌ | ✅ | ❌ | ❌ |
| PCM AIFF | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
| PCM CAF | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ |
| FLAC | ❌ | ✅ | ❌ | ✅ | ❌ | ✅ |
| AAC MP4 | ❌ | ✅ | ❌ | ✅ | ❌ | ✅ |
| AMR NB | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
| AMR WB | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
| PCM8 | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
| PCM F32 | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
| PCM WEBM | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
| Opus WEBM | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ |
| Vorbis WEBM | ❌ | ✅ | ❌ | ✅ | ❌ | ❌ |

  • Webkit is a real problem: you cannot record anything with Safari, nor with Firefox/Chrome on iOS.

  • Opus WEBM is a great codec. It works on everything (mobile and web browsers), except Apple.

  • Edge behaves the same as Chrome.
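One practical consequence of these tables: a cross-platform app has to choose its recording codec per platform. A minimal sketch follows; the codec choices come from the tables above, while the selection logic itself is an assumption, not a Flutter Sound API:

import 'dart:io' show Platform;
import 'package:flutter/foundation.dart' show kIsWeb;

// Pick a recording codec the current platform can actually encode,
// according to the tables above. Codec comes from flutter_sound.
Codec pickRecordingCodec() {
  if (kIsWeb) {
    return Codec.opusWebM; // Chrome/Firefox encode Opus WEBM; Safari records nothing
  }
  if (Platform.isIOS) {
    return Codec.opusCAF; // iOS encodes Opus in a CAF envelope
  }
  return Codec.aacADTS;   // Android encoder, needs minSdk >= 23
}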

Widgets

The τ built-in widgets.

The easiest way to start with Sounds is to use one of the built-in widgets:

  • SoundPlayerUI

  • SoundRecorderUI

  • RecorderPlaybackController

If you don't like any of the provided widgets, you can build your own from scratch. The Sounds widgets are all built using the public Sounds API, and they also serve as working examples for building your own widget.

SoundPlayerUI

The SoundPlayerUI widget provides a playback widget styled after the HTML 5 audio player.

The player displays a loading indicator and allows the user to pause/resume/skip via the progress bar.

You can also pause/resume the player via an API call to SoundPlayerUI's state, using a GlobalKey.

The SoundPlayerUI API documentation provides examples of using the SoundPlayerUI widget.
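A sketch of embedding the widget follows; SoundPlayerUI.fromTrack and Track.fromFile are assumed constructor names, so verify them against the API documentation:

// Sketch: embed an HTML5-style player for one Track in a widget tree.
// Constructor names are assumptions based on the Sounds API docs.
@override
Widget build(BuildContext context) {
  var track = Track.fromFile('sample.aac');
  return SoundPlayerUI.fromTrack(track);
}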

SoundRecorderUI

The SoundRecorderUI widget provides a simple UI for recording audio.

The audio is recorded to a Track.

TODO: add image here.

The SoundRecorderUI API documentation provides examples of using the SoundRecorderUI widget.

RecorderPlaybackController

The RecorderPlaybackController is a specialised widget used to coordinate a paired SoundPlayerUI widget and SoundRecorderUI widget.

Often, when providing an interface to record audio, you will want to let the user play the audio back after recording it. However, you don't want the user to start playback before the recording is complete.

The RecorderPlaybackController widget does not have a UI (it is actually an InheritedWidget); rather, it is used as a bridge that allows the paired SoundPlayerUI and SoundRecorderUI to communicate with each other.

The RecorderPlaybackController coordinates the UI state between the two components, so that playback and recording cannot happen at the same time.

See the API documentation on RecorderPlaybackController for examples of how to use it.
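A sketch of the pairing follows; the widget constructors are assumptions, and the point is simply that RecorderPlaybackController wraps both widgets:

// Sketch: the controller wraps the pair; it has no UI of its own.
// Widget constructor names are assumptions based on the Sounds API docs.
@override
Widget build(BuildContext context) {
  var track = Track.fromFile('note.aac');
  return RecorderPlaybackController(
    child: Column(
      children: [
        SoundRecorderUI(track),         // records into the track
        SoundPlayerUI.fromTrack(track), // playback enabled once recording stops
      ],
    ),
  );
}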




