# System audio source

Media Blocks SDK .Net

SystemAudioSourceBlock is used to access microphones and other audio capture devices.

## Block info

Name: SystemAudioSourceBlock.

| Pin direction | Media type         | Pins count |
|---------------|--------------------|------------|
| Output audio  | Uncompressed audio | 1          |

## Enumerate available devices

Call the DeviceEnumerator.Shared.AudioSourcesAsync() method to get the list of available devices and their specifications. Select a device and one of its formats to create the source settings.
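The enumeration step can be sketched as follows. This is a minimal illustration, not a verified listing: the `Formats` property and `AudioSourcesAsync()` call come from the sample code in this article, while the `Name` property used for printing is an assumption.

```csharp
// list the available audio capture devices and their formats
// (sketch: the Name property is an assumption; Formats and
// AudioSourcesAsync come from the samples in this article)
var devices = await DeviceEnumerator.Shared.AudioSourcesAsync();
foreach (var device in devices)
{
    Console.WriteLine(device.Name);

    foreach (var formatItem in device.Formats)
    {
        Console.WriteLine($"  {formatItem}");
    }
}
```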

## The sample pipeline

SystemAudioSourceBlock → AudioRendererBlock

## Sample code

```csharp
// create pipeline
var pipeline = new MediaBlocksPipeline();

// select the first available audio capture device
IAudioCaptureDeviceSourceSettings audioSourceSettings = null;
var device = (await DeviceEnumerator.Shared.AudioSourcesAsync())[0];
if (device != null)
{
    // select the first supported format
    var formatItem = device.Formats[0];
    if (formatItem != null)
    {
        audioSourceSettings = device.CreateSourceSettings(formatItem.ToFormat());
    }
}

// create the audio source block using the selected device and format
var audioSource = new SystemAudioSourceBlock(audioSourceSettings);

// create the audio renderer block
var audioRenderer = new AudioRendererBlock();

// connect blocks
pipeline.Connect(audioSource.Output, audioRenderer.Input);

// start the pipeline
await pipeline.StartAsync();
```

## Capture audio from speakers (loopback)

Loopback audio capture is currently supported only on Windows. Use the LoopbackAudioCaptureDeviceSourceSettings class to create the source settings for loopback capture.

WASAPI2 is the default API for loopback audio capture. You can specify which API to use during device enumeration.

```csharp
// create pipeline
var pipeline = new MediaBlocksPipeline();

// select the first audio output device, enumerated via the WASAPI2 API
var deviceItem = (await DeviceEnumerator.Shared.AudioOutputsAsync(AudioOutputDeviceAPI.WASAPI2))[0];
if (deviceItem == null)
{
    return;
}

// create loopback source settings for the selected output device
var audioSourceSettings = new LoopbackAudioCaptureDeviceSourceSettings(deviceItem);
var audioSource = new SystemAudioSourceBlock(audioSourceSettings);

// create the audio renderer block
var audioRenderer = new AudioRendererBlock();

// connect blocks
pipeline.Connect(audioSource.Output, audioRenderer.Input);

// start the pipeline
await pipeline.StartAsync();
```

## Sample applications

## Remarks

You can specify which API to use during device enumeration. Android and iOS each expose a single API, while Windows and Linux offer multiple APIs.
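The loopback sample above already selects an API explicitly by passing AudioOutputDeviceAPI.WASAPI2 to AudioOutputsAsync. A capture-side equivalent might look like the sketch below; note that the AudioCaptureDeviceAPI enum, its DirectSound value, and the AudioSourcesAsync overload taking it are hypothetical assumptions modeled on the output-side call, not confirmed by this article.

```csharp
// sketch: enumerate capture devices with an explicit API
// (AudioCaptureDeviceAPI and this overload are assumptions,
// modeled on AudioOutputsAsync(AudioOutputDeviceAPI.WASAPI2))
var devices = await DeviceEnumerator.Shared.AudioSourcesAsync(AudioCaptureDeviceAPI.DirectSound);
```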

## Platforms

Windows, macOS, Linux, iOS, Android.