# Sinks

Media Blocks SDK .Net

Sinks are blocks that save data to a file or stream it over a network. They are typically the last blocks in a pipeline, although some sinks also expose output pins that pass data on to the next block.

The SDK provides many different sinks for different purposes.
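
All samples below follow the same pattern: create a pipeline, connect a source to one or more encoders, connect the encoders to the sink's input pads, and start the pipeline. The samples end with StartAsync; a complete application also stops the pipeline when the conversion or broadcast is finished so that the sink can finalize the output file or close the network connection. A minimal sketch of that lifecycle (StopAsync and DisposeAsync are assumed here to be available on MediaBlocksPipeline; check the SDK reference for the exact names):

var pipeline = new MediaBlocksPipeline();

// ... create source, encoder, and sink blocks and connect them, as shown in the samples below ...

await pipeline.StartAsync();

// run for a while, then stop so the sink can finalize its output
await Task.Delay(TimeSpan.FromSeconds(10));
await pipeline.StopAsync();    // assumed API, see note above
await pipeline.DisposeAsync(); // assumed API, see note above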

# File sinks

The following file sinks are available:

  • ASF
  • AVI
  • File
  • MKV
  • MOV
  • MP4
  • MPEG-PS
  • MPEG-TS
  • MXF
  • OGG
  • WAV
  • WebM

# Network streaming

The following network streaming sinks are available:

  • Facebook Live
  • HLS
  • MJPEG over HTTP
  • NDI
  • SRT
  • SRT MPEG-TS
  • RTMP
  • YouTube Live

# File Sinks

# ASF

ASF (Advanced Systems Format) is a Microsoft digital container format for storing multimedia data. It is designed to be platform-independent and to support scalable media types such as audio and video.

Use the ASFSinkSettings class to set the parameters.

# Block info

Name: ASFSinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input audio | audio/x-raw, audio/mpeg, audio/x-ac3, audio/x-alaw, audio/x-mulaw, audio/x-wma | one or more |
| Input video | video/x-raw, image/jpeg, video/x-divx, video/x-msmpeg, video/mpeg, video/x-h263, video/x-h264, video/x-dv, video/x-huffyuv, video/x-wmv, video/x-jpc, video/x-vp8, image/png | one or more |

# The sample pipeline

graph LR;
    UniversalSourceBlock-->WMVEncoderBlock;
    UniversalSourceBlock-->WMAEncoderBlock;
    WMVEncoderBlock-->ASFSinkBlock;
    WMAEncoderBlock-->ASFSinkBlock;

# Sample code

var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioEncoderBlock = new WMAEncoderBlock(new WMAEncoderSettings());
pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input);

var videoEncoderBlock = new WMVEncoderBlock(new WMVEncoderSettings());
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input);

var sinkBlock = new ASFSinkBlock(new ASFSinkSettings(@"output.wmv"));
pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));
pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();

# Platforms

Windows, macOS, Linux, iOS, Android.

# AVI

AVI (Audio Video Interleave) is a multimedia container format introduced by Microsoft. It enables synchronized audio-and-video playback by interleaving segments of audio and video data.

Use the AVISinkSettings class to set the parameters.

# Block info

Name: AVISinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input audio | audio/x-raw, audio/mpeg, audio/x-ac3, audio/x-alaw, audio/x-mulaw | one or more |
| Input video | video/x-raw, image/jpeg, video/x-divx, video/x-msmpeg, video/mpeg, video/x-h263, video/x-h264, video/x-dv, video/x-huffyuv, image/png | one or more |

# The sample pipeline

graph LR;
    UniversalSourceBlock-->MP3EncoderBlock;
    UniversalSourceBlock-->DIVXEncoderBlock;
    MP3EncoderBlock-->AVISinkBlock;
    DIVXEncoderBlock-->AVISinkBlock;

# Sample code

var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioEncoderBlock = new MP3EncoderBlock(new MP3EncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input);

var videoEncoderBlock = new DIVXEncoderBlock(new DIVXEncoderSettings());
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input);

var sinkBlock = new AVISinkBlock(new AVISinkSettings(@"output.avi"));
pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));
pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();

# Platforms

Windows, macOS, Linux, iOS, Android.

# File

Universal file output. This sink is used internally by the other, higher-level sinks (for example, the MP4 sink), and it can also be used directly to write raw or encoded video and audio data to a file.

# Block info

Name: FileSinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input | Any stream format | 1 |

# The sample pipeline

graph LR;
    UniversalSourceBlock-->MP3EncoderBlock;
    MP3EncoderBlock-->FileSinkBlock;

# Sample code

var pipeline = new MediaBlocksPipeline();

var filename = "test.mp3";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var mp3EncoderBlock = new MP3EncoderBlock(new MP3EncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, mp3EncoderBlock.Input);

var fileSinkBlock = new FileSinkBlock(@"output.mp3");
pipeline.Connect(mp3EncoderBlock.Output, fileSinkBlock.Input);

await pipeline.StartAsync();
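
As noted above, FileSinkBlock can also store raw data. The following sketch dumps uncompressed frames from a virtual video source straight into a file; it assumes that a raw stream may be connected directly to the sink, as the "Any stream format" input pin suggests.

var pipeline = new MediaBlocksPipeline();

// test-pattern video source producing raw frames
var videoSource = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings
{
    Width = 640,
    Height = 480,
    FrameRate = VideoFrameRate.FPS_25,
});

// write the raw frames to a file
var fileSinkBlock = new FileSinkBlock(@"output.raw");
pipeline.Connect(videoSource.Output, fileSinkBlock.Input);

await pipeline.StartAsync();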

# Platforms

Windows, macOS, Linux, iOS, Android.

# MKV

MKV (Matroska) is an open-standard, royalty-free container format, similar to MP4 and AVI but more flexible and with more advanced features.

Use the MKVSinkSettings class to set the parameters.

# Block info

Name: MKVSinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input audio | audio/x-raw, audio/mpeg, audio/x-ac3, audio/x-alaw, audio/x-mulaw, audio/x-wma, audio/x-vorbis, audio/x-opus, audio/x-flac | one or more |
| Input video | video/x-raw, image/jpeg, video/x-divx, video/x-msmpeg, video/mpeg, video/x-h263, video/x-h264, video/x-h265, video/x-dv, video/x-huffyuv, video/x-wmv, video/x-jpc, video/x-vp8, video/x-vp9, video/x-theora, image/png | one or more |

# The sample pipeline

graph LR;
    UniversalSourceBlock-->VorbisEncoderBlock;
    UniversalSourceBlock-->VP9EncoderBlock;
    VorbisEncoderBlock-->MKVSinkBlock;
    VP9EncoderBlock-->MKVSinkBlock;

# Sample code

var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioEncoderBlock = new VorbisEncoderBlock(new VorbisEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input);

var videoEncoderBlock = new VP9EncoderBlock(new VP9EncoderSettings() { Bitrate = 2000 });
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input);

var sinkBlock = new MKVSinkBlock(new MKVSinkSettings(@"output.mkv"));
pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));
pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();

# Platforms

Windows, macOS, Linux, iOS, Android.

# MOV

MOV (QuickTime File Format) is a multimedia container format developed by Apple for storing video, audio, and other time-based media. It supports many codecs and is widely used on Apple platforms as well as in professional video editing.

Use the MOVSinkSettings class to set the parameters.

# Block info

Name: MOVSinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input audio | audio/x-raw, audio/mpeg, audio/x-ac3, audio/x-alaw, audio/x-mulaw, audio/AAC | one or more |
| Input video | video/x-raw, image/jpeg, video/x-divx, video/x-msmpeg, video/mpeg, video/x-h263, video/x-h264, video/x-h265, video/x-dv, video/x-huffyuv, image/png | one or more |

# The sample pipeline

graph LR;
    UniversalSourceBlock-->AACEncoderBlock;
    UniversalSourceBlock-->H264EncoderBlock;
    AACEncoderBlock-->MOVSinkBlock;
    H264EncoderBlock-->MOVSinkBlock;

# Sample code

var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioEncoderBlock = new AACEncoderBlock(new AACEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input);

var videoEncoderBlock = new H264EncoderBlock(new OpenH264EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input);

var sinkBlock = new MOVSinkBlock(new MOVSinkSettings(@"output.mov"));
pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));
pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();

# Platforms

Windows, macOS, Linux, iOS, Android.

# MP4

MP4 (MPEG-4 Part 14) is a digital multimedia container format used to store video, audio, and other data such as subtitles and images. It's widely used for sharing video content online and is compatible with a wide range of devices and platforms.

Use the MP4SinkSettings class to set the parameters.

# Block info

Name: MP4SinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input audio | audio/x-raw, audio/mpeg, audio/x-ac3, audio/x-alaw, audio/x-mulaw, audio/AAC | one or more |
| Input video | video/x-raw, image/jpeg, video/x-divx, video/x-msmpeg, video/mpeg, video/x-h263, video/x-h264, video/x-h265, video/x-dv, video/x-huffyuv, image/png | one or more |

# The sample pipeline

graph LR;
    UniversalSourceBlock-->AACEncoderBlock;
    UniversalSourceBlock-->H264EncoderBlock;
    AACEncoderBlock-->MP4SinkBlock;
    H264EncoderBlock-->MP4SinkBlock;

# Sample code

var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioEncoderBlock = new AACEncoderBlock(new AACEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input);

var videoEncoderBlock = new H264EncoderBlock(new OpenH264EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input);

var sinkBlock = new MP4SinkBlock(new MP4SinkSettings(@"output.mp4"));
pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));
pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();

# Platforms

Windows, macOS, Linux, iOS, Android.

# MPEG-PS

MPEG-PS (MPEG Program Stream) is a container format for multiplexing digital audio, video, and other data. It is designed for reasonably reliable media, such as DVDs, CD-ROMs, and other disc media.

Use the MPEGPSSinkSettings class to set the parameters.

# Block info

Name: MPEGPSSinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input audio | audio/x-raw, audio/mpeg, audio/x-ac3, audio/x-alaw, audio/x-mulaw | one or more |
| Input video | video/x-raw, image/jpeg, video/x-msmpeg, video/mpeg, video/x-h263, video/x-h264 | one or more |

# The sample pipeline

graph LR;
    UniversalSourceBlock-->MP2EncoderBlock;
    UniversalSourceBlock-->MPEG2EncoderBlock;
    MP2EncoderBlock-->MPEGPSSinkBlock;
    MPEG2EncoderBlock-->MPEGPSSinkBlock;

# Sample code

var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioEncoderBlock = new MP2EncoderBlock(new MP2EncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input);

var videoEncoderBlock = new MPEG2EncoderBlock(new MPEG2EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input);

var sinkBlock = new MPEGPSSinkBlock(new MPEGPSSinkSettings(@"output.mpg"));
pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));
pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();

# Platforms

Windows, macOS, Linux, iOS, Android.

# MPEG-TS

MPEG-TS (MPEG Transport Stream) is a standard digital container format for transmission and storage of audio, video, and Program and System Information Protocol (PSIP) data. It is used in broadcast systems such as DVB, ATSC and IPTV.

Use the MPEGTSSinkSettings class to set the parameters.

# Block info

Name: MPEGTSSinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input audio | audio/x-raw, audio/mpeg, audio/x-ac3, audio/x-alaw, audio/x-mulaw, audio/AAC | one or more |
| Input video | video/x-raw, image/jpeg, video/x-msmpeg, video/mpeg, video/x-h263, video/x-h264, video/x-h265 | one or more |

# The sample pipeline

graph LR;
    UniversalSourceBlock-->AACEncoderBlock;
    UniversalSourceBlock-->H264EncoderBlock;
    AACEncoderBlock-->MPEGTSSinkBlock;
    H264EncoderBlock-->MPEGTSSinkBlock;

# Sample code

var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioEncoderBlock = new AACEncoderBlock(new AACEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input);

var videoEncoderBlock = new H264EncoderBlock(new OpenH264EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input);

var sinkBlock = new MPEGTSSinkBlock(new MPEGTSSinkSettings(@"output.ts"));
pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));
pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();

# Platforms

Windows, macOS, Linux, iOS, Android.

# MXF

MXF (Material Exchange Format) is a container format for professional digital video and audio media, developed to improve file exchange, interoperability, and project workflows between production houses and content/equipment providers.

Use the MXFSinkSettings class to set the parameters.

# Block info

Name: MXFSinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input audio | audio/x-raw, audio/mpeg, audio/x-ac3, audio/x-alaw, audio/x-mulaw, audio/AAC | one or more |
| Input video | video/x-raw, image/jpeg, video/x-divx, video/x-msmpeg, video/mpeg, video/x-h263, video/x-h264, video/x-h265, video/x-dv, image/png | one or more |

# The sample pipeline

graph LR;
    UniversalSourceBlock-->PCMEncoderBlock;
    UniversalSourceBlock-->DIVXEncoderBlock;
    PCMEncoderBlock-->MXFSinkBlock;
    DIVXEncoderBlock-->MXFSinkBlock;

# Sample code

var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioBlock = new PCMEncoderBlock(new PCMEncoderSettings());
pipeline.Connect(fileSource.AudioOutput, audioBlock.Input);

var videoEncoderBlock = new DIVXEncoderBlock(new DIVXEncoderSettings());
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input);

var sinkBlock = new MXFSinkBlock(new MXFSinkSettings(@"output.mxf"));
pipeline.Connect(audioBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));
pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();

# Platforms

Windows, macOS, Linux, iOS, Android.

# Network Streaming Sinks

# RTMP

RTMP (Real-Time Messaging Protocol), developed by Adobe, is a protocol for streaming audio, video, and data over the Internet, optimized for high-performance transmission. It enables efficient, low-latency communication and is commonly used for live broadcasts such as sports events and concerts.

Use the RTMPSinkSettings class to set the parameters.

# Block info

Name: RTMPSinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input audio | audio/mpeg [1,2,4], audio/x-adpcm, PCM [U8, S16LE], audio/x-speex, audio/x-mulaw, audio/x-alaw, audio/x-nellymoser | one |
| Input video | video/x-h264 | one |

# The sample pipeline

graph LR;
    VirtualVideoSourceBlock-->H264EncoderBlock;
    VirtualAudioSourceBlock-->AACEncoderBlock;
    H264EncoderBlock-->RTMPSinkBlock;
    AACEncoderBlock-->RTMPSinkBlock;

# Sample code

// Pipeline
var pipeline = new MediaBlocksPipeline();

// video and audio sources
var virtualVideoSource = new VirtualVideoSourceSettings
{
    Width = 1280,
    Height = 720,
    FrameRate = VideoFrameRate.FPS_25,
};

var videoSource = new VirtualVideoSourceBlock(virtualVideoSource);

var virtualAudioSource = new VirtualAudioSourceSettings
{
     Channels = 2,
     SampleRate = 44100,
};

var audioSource = new VirtualAudioSourceBlock(virtualAudioSource);

// H264/AAC encoders
var h264Encoder = new H264EncoderBlock(new OpenH264EncoderSettings());
var aacEncoder = new AACEncoderBlock();

pipeline.Connect(videoSource.Output, h264Encoder.Input);
pipeline.Connect(audioSource.Output, aacEncoder.Input);

// RTMP sink
// Note: configure the target RTMP URL (and stream key, if required) in RTMPSinkSettings before starting.
var sink = new RTMPSinkBlock(new RTMPSinkSettings());
pipeline.Connect(h264Encoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(aacEncoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Audio));

// Start
await pipeline.StartAsync();

# Platforms

Windows, macOS, Linux, iOS, Android.

# OGG

OGG is a free, open container format designed for efficient streaming and manipulation of high-quality digital multimedia. Developed by the Xiph.Org Foundation, it supports audio codecs such as Vorbis, Opus, and FLAC, and video codecs such as Theora.

Use the OGGSinkSettings class to set the parameters.

# Block info

Name: OGGSinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input audio | audio/x-raw, audio/x-vorbis, audio/x-flac, audio/x-speex, audio/x-celt, audio/x-opus | one or more |
| Input video | video/x-raw, video/x-theora, video/x-dirac | one or more |

# The sample pipeline

graph LR;
    UniversalSourceBlock-->VorbisEncoderBlock;
    UniversalSourceBlock-->TheoraEncoderBlock;
    VorbisEncoderBlock-->OGGSinkBlock;
    TheoraEncoderBlock-->OGGSinkBlock;

# Sample code

var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioEncoderBlock = new VorbisEncoderBlock(new VorbisEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input);

var videoEncoderBlock = new TheoraEncoderBlock(new TheoraEncoderSettings());
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input);

var sinkBlock = new OGGSinkBlock(new OGGSinkSettings(@"output.ogg"));
pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));
pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();

# Platforms

Windows, macOS, Linux, iOS, Android.

# WAV

WAV (Waveform Audio File Format) is an audio file format standard developed by IBM and Microsoft for storing audio bitstreams on PCs. It is the main format used on Windows systems for raw and typically uncompressed audio.

Use the WAVSinkSettings class to set the parameters.

# Block info

Name: WAVSinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input audio | audio/x-raw, audio/x-alaw, audio/x-mulaw | one |

# The sample pipeline

graph LR;
    UniversalSourceBlock-->PCMEncoderBlock;
    PCMEncoderBlock-->WAVSinkBlock;

# Sample code

var pipeline = new MediaBlocksPipeline();

var filename = "test.mp3";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioBlock = new PCMEncoderBlock(new PCMEncoderSettings());
pipeline.Connect(fileSource.AudioOutput, audioBlock.Input);

var sinkBlock = new WAVSinkBlock(new WAVSinkSettings(@"output.wav"));
pipeline.Connect(audioBlock.Output, sinkBlock.Input);

await pipeline.StartAsync();

# Platforms

Windows, macOS, Linux, iOS, Android.

# WebM

WebM is an open, royalty-free media file format designed for the web. It defines the file container structure as well as the video and audio formats. WebM files consist of video streams compressed with the VP8 or VP9 video codec and audio streams compressed with the Vorbis or Opus audio codec.

Use the WebMSinkSettings class to set the parameters.

# Block info

Name: WebMSinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input audio | audio/x-raw, audio/x-vorbis, audio/x-opus | one or more |
| Input video | video/x-raw, video/x-vp8, video/x-vp9 | one or more |

# The sample pipeline

graph LR;
    UniversalSourceBlock-->VorbisEncoderBlock;
    UniversalSourceBlock-->VP9EncoderBlock;
    VorbisEncoderBlock-->WebMSinkBlock;
    VP9EncoderBlock-->WebMSinkBlock;

# Sample code

var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioEncoderBlock = new VorbisEncoderBlock(new VorbisEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input);

var videoEncoderBlock = new VP9EncoderBlock(new VP9EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input);

var sinkBlock = new WebMSinkBlock(new WebMSinkSettings(@"output.webm"));
pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));
pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();

# Platforms

Windows, macOS, Linux, iOS, Android.

# Facebook Live

Facebook Live is a feature that allows live streaming of video on Facebook. The livestream can be published to personal profiles, pages, or groups.

Use the FacebookLiveSinkSettings class to set the parameters.

# Block info

Name: FacebookLiveSinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input audio | audio/mpeg [1,2,4], audio/x-adpcm, PCM [U8, S16LE], audio/x-speex, audio/x-mulaw, audio/x-alaw, audio/x-nellymoser | one |
| Input video | video/x-h264 | one |

# The sample pipeline

graph LR;
    VirtualVideoSourceBlock-->H264EncoderBlock;
    VirtualAudioSourceBlock-->AACEncoderBlock;
    H264EncoderBlock-->FacebookLiveSinkBlock;
    AACEncoderBlock-->FacebookLiveSinkBlock;

# Sample code

// Pipeline
var pipeline = new MediaBlocksPipeline();

// video and audio sources
var virtualVideoSource = new VirtualVideoSourceSettings
{
    Width = 1280,
    Height = 720,
    FrameRate = VideoFrameRate.FPS_25,
};

var videoSource = new VirtualVideoSourceBlock(virtualVideoSource);

var virtualAudioSource = new VirtualAudioSourceSettings
{
     Channels = 2,
     SampleRate = 44100,
};

var audioSource = new VirtualAudioSourceBlock(virtualAudioSource);

// H264/AAC encoders
var h264Encoder = new H264EncoderBlock(new OpenH264EncoderSettings());
var aacEncoder = new AACEncoderBlock();

pipeline.Connect(videoSource.Output, h264Encoder.Input);
pipeline.Connect(audioSource.Output, aacEncoder.Input);

// Facebook Live sink
var sink = new FacebookLiveSinkBlock(new FacebookLiveSinkSettings(
    "https://facebook.com/rtmp/...",
    "your_stream_key"));
pipeline.Connect(h264Encoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(aacEncoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Audio));

// Start
await pipeline.StartAsync();

# Platforms

Windows, macOS, Linux, iOS, Android.

# HLS

HLS (HTTP Live Streaming) is an HTTP-based adaptive streaming communications protocol developed by Apple. It enables adaptive bitrate streaming by breaking the stream into a sequence of small HTTP-based file segments, typically using MPEG-TS fragments as the container.

Use the HLSSinkSettings class to set the parameters.

# Block info

Name: HLSSinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input audio | audio/mpeg, audio/x-ac3, audio/x-alaw, audio/x-mulaw, audio/AAC | one or more |
| Input video | video/x-raw, image/jpeg, video/x-msmpeg, video/mpeg, video/x-h263, video/x-h264, video/x-h265 | one or more |

# The sample pipeline

graph LR;
    UniversalSourceBlock-->AACEncoderBlock;
    UniversalSourceBlock-->H264EncoderBlock1;
    UniversalSourceBlock-->H264EncoderBlock2;
    UniversalSourceBlock-->H264EncoderBlock3;
    AACEncoderBlock-->HLSSinkBlock;
    H264EncoderBlock1-->HLSSinkBlock;
    H264EncoderBlock2-->HLSSinkBlock;
    H264EncoderBlock3-->HLSSinkBlock;

# Sample code

var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioEncoderBlock = new AACEncoderBlock(new AACEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input);

// 3 video encoders with different bitrates for adaptive streaming
var videoEncoderBlock1 = new H264EncoderBlock(new OpenH264EncoderSettings { Bitrate = 3000, Width = 1920, Height = 1080 });
var videoEncoderBlock2 = new H264EncoderBlock(new OpenH264EncoderSettings { Bitrate = 1500, Width = 1280, Height = 720 });
var videoEncoderBlock3 = new H264EncoderBlock(new OpenH264EncoderSettings { Bitrate = 800, Width = 854, Height = 480 });
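
// Note: depending on the SDK version, feeding one source output into several encoder inputs
// may require an explicit tee/splitter block between the source and the encoders.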

pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock1.Input);
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock2.Input);
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock3.Input);

// Configure HLS sink
var hlsSettings = new HLSSinkSettings("./output/")
{
    PlaylistName = "playlist.m3u8",
    SegmentDuration = 6,
    PlaylistType = HLSPlaylistType.Event,
    HTTPServerEnabled = true,
    HTTPServerPort = 8080
};

var sinkBlock = new HLSSinkBlock(hlsSettings);

// Connect audio
pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));

// Connect video variants
pipeline.Connect(videoEncoderBlock1.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video, "1080p"));
pipeline.Connect(videoEncoderBlock2.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video, "720p"));
pipeline.Connect(videoEncoderBlock3.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video, "480p"));

await pipeline.StartAsync();

# Platforms

Windows, macOS, Linux, iOS, Android.

# MJPEG over HTTP

HTTP MJPEG (Motion JPEG) Live is a video streaming format where each video frame is compressed separately as a JPEG image and transmitted over HTTP. It is widely used in IP cameras and webcams due to its simplicity, although it is less efficient than modern codecs.

Use the HTTPMJPEGLiveSinkSettings class to set the parameters.

# Block info

Name: HTTPMJPEGLiveSinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input video | video/x-raw, image/jpeg | one |

# The sample pipeline

graph LR;
    VirtualVideoSourceBlock-->MJPEGEncoderBlock;
    MJPEGEncoderBlock-->HTTPMJPEGLiveSinkBlock;

# Sample code

var pipeline = new MediaBlocksPipeline();

// Create virtual video source
var virtualVideoSource = new VirtualVideoSourceSettings
{
    Width = 1280,
    Height = 720,
    FrameRate = VideoFrameRate.FPS_30,
};

var videoSource = new VirtualVideoSourceBlock(virtualVideoSource);

// MJPEG encoder
var mjpegEncoder = new MJPEGEncoderBlock(new MJPEGEncoderSettings { Quality = 80 });
pipeline.Connect(videoSource.Output, mjpegEncoder.Input);

// HTTP MJPEG server
var sink = new HTTPMJPEGLiveSinkBlock(new HTTPMJPEGLiveSinkSettings
{
    Port = 8080,
    Path = "/stream"
});
pipeline.Connect(mjpegEncoder.Output, sink.Input);

// Start
await pipeline.StartAsync();

Console.WriteLine("MJPEG stream available at http://localhost:8080/stream");
Console.WriteLine("Press any key to stop...");
Console.ReadKey();

# Platforms

Windows, macOS, Linux, iOS, Android.

# NDI

NDI (Network Device Interface) is a royalty-free video transport standard developed by NewTek that enables video-compatible products to communicate over standard Ethernet networks, delivering and receiving broadcast-quality video with low latency.

Use the NDISinkSettings class to set the parameters.

# Block info

Name: NDISinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input audio | audio/x-raw | one |
| Input video | video/x-raw | one |

# The sample pipeline

graph LR;
    UniversalSourceBlock-->NDISinkBlock;

# Sample code

var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var sinkBlock = new NDISinkBlock(new NDISinkSettings("My NDI Stream"));
pipeline.Connect(fileSource.AudioOutput, sinkBlock.AudioInput);
pipeline.Connect(fileSource.VideoOutput, sinkBlock.VideoInput);

await pipeline.StartAsync();
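
Because the NDI sink accepts only raw audio and video, it can also be fed directly from the virtual test-signal sources used in the RTMP sample, with no encoders in between. A minimal sketch using only the settings classes shown elsewhere in this document (the stream name is an arbitrary example):

var pipeline = new MediaBlocksPipeline();

// test video and audio signals
var videoSource = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings
{
    Width = 1280,
    Height = 720,
    FrameRate = VideoFrameRate.FPS_25,
});

var audioSource = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings
{
    Channels = 2,
    SampleRate = 44100,
});

// NDI sink publishing the stream on the local network
var sinkBlock = new NDISinkBlock(new NDISinkSettings("My NDI Stream (Test)"));
pipeline.Connect(videoSource.Output, sinkBlock.VideoInput);
pipeline.Connect(audioSource.Output, sinkBlock.AudioInput);

await pipeline.StartAsync();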

# Platforms

Windows, macOS, Linux.

# SRT

SRT (Secure Reliable Transport) is an open source video transport protocol that enables the delivery of high-quality, secure, low-latency video across unpredictable networks like the public internet. It was developed by Haivision.

Use the SRTSinkSettings class to set the parameters.

# Block info

Name: SRTSinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input | Any stream format | 1 |

# The sample pipeline

graph LR;
    UniversalSourceBlock-->MP4MuxerBlock;
    MP4MuxerBlock-->SRTSinkBlock;

# Sample code

var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

// Create a multiplexer block to combine audio and video
var muxer = new MP4MuxerBlock();
pipeline.Connect(fileSource.AudioOutput, muxer.CreateNewInput(MediaBlockPadMediaType.Audio));
pipeline.Connect(fileSource.VideoOutput, muxer.CreateNewInput(MediaBlockPadMediaType.Video));

// Create SRT sink in caller mode (connecting to a listener)
var srtSettings = new SRTSinkSettings
{
    Host = "srt-server.example.com",
    Port = 1234,
    Mode = SRTMode.Caller,
    Latency = 200, // milliseconds
    Passphrase = "optional-encryption-passphrase"
};

var srtSink = new SRTSinkBlock(srtSettings);
pipeline.Connect(muxer.Output, srtSink.Input);

await pipeline.StartAsync();
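
The sample above connects out to a remote SRT listener. To make the sink wait for incoming connections instead, the mode can be switched to listener; the sketch below assumes an SRTMode.Listener value exists alongside the SRTMode.Caller value used above (check the SDK reference for the exact name).

// SRT sink in listener mode: players connect to this machine on port 1234
var srtListenerSettings = new SRTSinkSettings
{
    Host = "0.0.0.0",        // listen on all local interfaces (assumption for listener mode)
    Port = 1234,
    Mode = SRTMode.Listener, // assumed enum value, see note above
    Latency = 200
};

var srtListenerSink = new SRTSinkBlock(srtListenerSettings);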

# Platforms

Windows, macOS, Linux, iOS, Android.

# SRT MPEG-TS

SRT MPEG-TS combines the SRT transport protocol with the MPEG-TS container format. This allows secure, reliable transport of MPEG-TS streams over public networks, which is useful for broadcast and professional video workflows.

Use the SRTMPEGTSSinkSettings class to set the parameters.

# Block info

Name: SRTMPEGTSSinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input audio | audio/x-raw, audio/mpeg, audio/x-ac3, audio/x-alaw, audio/x-mulaw, audio/AAC | one or more |
| Input video | video/x-raw, image/jpeg, video/x-msmpeg, video/mpeg, video/x-h263, video/x-h264, video/x-h265 | one or more |

# The sample pipeline

graph LR;
    UniversalSourceBlock-->AACEncoderBlock;
    UniversalSourceBlock-->H264EncoderBlock;
    AACEncoderBlock-->SRTMPEGTSSinkBlock;
    H264EncoderBlock-->SRTMPEGTSSinkBlock;

# Sample code

var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioEncoderBlock = new AACEncoderBlock(new AACEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input);

var videoEncoderBlock = new H264EncoderBlock(new OpenH264EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input);

// Configure SRT MPEG-TS sink
var srtMpegtsSinkSettings = new SRTMPEGTSSinkSettings
{
    Host = "srt-server.example.com",
    Port = 1234,
    Mode = SRTMode.Caller,
    Latency = 200,
    Passphrase = "optional-encryption-passphrase"
};

var sinkBlock = new SRTMPEGTSSinkBlock(srtMpegtsSinkSettings);
pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));
pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();

# Platforms

Windows, macOS, Linux, iOS, Android.

# YouTube Live

YouTube Live is a live streaming service provided by YouTube. It allows creators to broadcast live videos to their audience through the YouTube platform.

Use the YouTubeSinkSettings class to set the parameters.

# Block info

Name: YouTubeSinkBlock.

| Pin direction | Media type | Pins count |
| --- | --- | --- |
| Input audio | audio/mpeg [1,2,4], audio/x-adpcm, PCM [U8, S16LE], audio/x-speex, audio/x-mulaw, audio/x-alaw, audio/x-nellymoser | one |
| Input video | video/x-h264 | one |

# The sample pipeline

graph LR;
    VirtualVideoSourceBlock-->H264EncoderBlock;
    VirtualAudioSourceBlock-->AACEncoderBlock;
    H264EncoderBlock-->YouTubeSinkBlock;
    AACEncoderBlock-->YouTubeSinkBlock;

# Sample code

// Pipeline
var pipeline = new MediaBlocksPipeline();

// video and audio sources
var virtualVideoSource = new VirtualVideoSourceSettings
{
    Width = 1920,
    Height = 1080,
    FrameRate = VideoFrameRate.FPS_30,
};

var videoSource = new VirtualVideoSourceBlock(virtualVideoSource);

var virtualAudioSource = new VirtualAudioSourceSettings
{
     Channels = 2,
     SampleRate = 48000,
};

var audioSource = new VirtualAudioSourceBlock(virtualAudioSource);

// H264/AAC encoders
var h264Settings = new OpenH264EncoderSettings
{
    Bitrate = 4000, // 4 Mbps for 1080p
    KeyframeInterval = 2 // Keyframe every 2 seconds
};
var h264Encoder = new H264EncoderBlock(h264Settings);

var aacSettings = new AACEncoderSettings
{
    Bitrate = 192 // 192 kbps for audio
};
var aacEncoder = new AACEncoderBlock(aacSettings);

pipeline.Connect(videoSource.Output, h264Encoder.Input);
pipeline.Connect(audioSource.Output, aacEncoder.Input);

// YouTube Live sink
var sink = new YouTubeSinkBlock(new YouTubeSinkSettings(
    "rtmp://a.rtmp.youtube.com/live2/",
    "your_youtube_stream_key"));
pipeline.Connect(h264Encoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(aacEncoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Audio));

// Start
await pipeline.StartAsync();

# Platforms

Windows, macOS, Linux, iOS, Android.