Sinks are blocks that save or stream data. They are the last blocks in a pipeline, although some sinks also expose output pins that pass data on to the next block.
The SDK provides many different sinks for different purposes.
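All sinks are used the same way: create a MediaBlocksPipeline, connect a source (through encoder blocks when the container or protocol needs compressed streams) to the sink block, and start the pipeline. The sketch below illustrates that flow with the MP4 sink. It is a minimal sketch: the MP4SinkBlock type, its filename-taking MP4SinkSettings constructor, the H264EncoderBlock/AACEncoderBlock encoders, and the CreateNewInput pads are assumptions that may differ from the actual API in your SDK version.

// Minimal sketch of the common sink pattern (block and pad names are assumptions)
var pipeline = new MediaBlocksPipeline();

// File source; a capture source block can be used instead for live input
var source = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri("input.avi")));

// Hypothetical encoder blocks with default settings; most sinks expect encoded video/audio
var videoEncoder = new H264EncoderBlock();
var audioEncoder = new AACEncoderBlock();

// Hypothetical sink block created from its settings class
var sink = new MP4SinkBlock(new MP4SinkSettings("output.mp4"));

pipeline.Connect(source.VideoOutput, videoEncoder.Input);
pipeline.Connect(source.AudioOutput, audioEncoder.Input);
pipeline.Connect(videoEncoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(audioEncoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Audio));

await pipeline.StartAsync();
// ... record or stream ...
await pipeline.StopAsync();

The sink-specific snippets below only show how each sink is created and wired; the rest of the pipeline stays as in this skeleton.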
ASF (Advanced Systems Format) is a Microsoft digital container format used to store multimedia data. It is designed to be platform-independent and to support scalable media types such as audio and video.
Use the ASFSinkSettings class to set the parameters.
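For example, assuming an ASFSinkBlock exists and that ASFSinkSettings takes an output filename (both assumptions), swapping the sink in the skeleton above could look like this:

// Hypothetical ASF sink; connect the encoder outputs to it as in the skeleton
var asfSink = new ASFSinkBlock(new ASFSinkSettings("output.asf"));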
AVI (Audio Video Interleave) is a multimedia container format introduced by Microsoft. It enables simultaneous audio-with-video playback by alternating segments of audio and video data.
Use the AVISinkSettings class to set the parameters.
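A minimal sketch under the same assumptions (an AVISinkBlock wrapping a filename-based AVISinkSettings):

// Hypothetical AVI sink; pick encoders that your target AVI players support
var aviSink = new AVISinkBlock(new AVISinkSettings("output.avi"));
pipeline.Connect(videoEncoder.Output, aviSink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(audioEncoder.Output, aviSink.CreateNewInput(MediaBlockPadMediaType.Audio));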
MOV (QuickTime File Format) is a multimedia container format developed by Apple for storing video, audio, and other time-based media. It supports various codecs and is widely used for multimedia content on Apple platforms, and also in professional video editing.
Use the MOVSinkSettings class to set the parameters.
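For example (a MOVSinkBlock and a filename-taking MOVSinkSettings constructor are assumed):

// Hypothetical MOV/QuickTime sink; wire the H264/AAC encoder outputs to it as in the skeleton
var movSink = new MOVSinkBlock(new MOVSinkSettings("output.mov"));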
MP4 (MPEG-4 Part 14) is a digital multimedia container format used to store video, audio, and other data such as subtitles and images. It's widely used for sharing video content online and is compatible with a wide range of devices and platforms.
Use the MP4SinkSettings class to set the parameters.
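The skeleton at the top of this section already uses the MP4 sink; the relevant lines are repeated here (MP4SinkBlock and its filename constructor are assumptions):

var mp4Sink = new MP4SinkBlock(new MP4SinkSettings("output.mp4"));
pipeline.Connect(videoEncoder.Output, mp4Sink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(audioEncoder.Output, mp4Sink.CreateNewInput(MediaBlockPadMediaType.Audio));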
MPEG-PS (MPEG Program Stream) is a container format for multiplexing digital audio, video, and other data. It is designed for reasonably reliable media, such as DVDs, CD-ROMs, and other disc media.
Use the MPEGPSSinkSettings class to set the parameters.
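A sketch, assuming an MPEGPSSinkBlock built from MPEGPSSinkSettings with an output path:

// Hypothetical MPEG-PS sink; pair it with MPEG-compatible encoders and connect them as in the skeleton
var psSink = new MPEGPSSinkBlock(new MPEGPSSinkSettings("output.mpg"));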
MPEG-TS (MPEG Transport Stream) is a standard digital container format for transmission and storage of audio, video, and Program and System Information Protocol (PSIP) data. It is used in broadcast systems such as DVB, ATSC and IPTV.
Use the MPEGTSSinkSettings class to set the parameters.
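A sketch under the same assumptions (an MPEGTSSinkBlock and a filename-based MPEGTSSinkSettings):

// Hypothetical MPEG-TS sink for file output; connect the encoder outputs as in the skeleton
var tsSink = new MPEGTSSinkBlock(new MPEGTSSinkSettings("output.ts"));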
MXF (Material Exchange Format) is a container format for professional digital video and audio media, developed to address file-exchange and interoperability issues and to improve project workflow between production houses and content/equipment providers.
Use the MXFSinkSettings class to set the parameters.
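For example (an MXFSinkBlock and its filename constructor are assumptions; real MXF settings may also require additional stream parameters):

// Hypothetical MXF sink for professional workflows; connect the encoder outputs as in the skeleton
var mxfSink = new MXFSinkBlock(new MXFSinkSettings("output.mxf"));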
OGG is a free, open container format designed for efficient streaming and manipulation of high quality digital multimedia. It is developed by the Xiph.Org Foundation and supports audio codecs like Vorbis, Opus, and FLAC, and video codecs like Theora.
Use the OGGSinkSettings class to set the parameters.
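An audio-only sketch, assuming a VorbisEncoderBlock and an OGGSinkBlock with a single Input pad (all names are assumptions):

var vorbisEncoder = new VorbisEncoderBlock();                      // hypothetical Vorbis encoder block
var oggSink = new OGGSinkBlock(new OGGSinkSettings("output.ogg"));
pipeline.Connect(source.AudioOutput, vorbisEncoder.Input);
pipeline.Connect(vorbisEncoder.Output, oggSink.Input);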
WAV (Waveform Audio File Format) is an audio file format standard developed by IBM and Microsoft for storing audio bitstreams on PCs. It is the main format used on Windows systems for raw and typically uncompressed audio.
Use the WAVSinkSettings class to set the parameters.
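Because WAV typically stores uncompressed PCM, the source audio can usually be connected without an encoder; a sketch, assuming a WAVSinkBlock with a single Input pad:

var wavSink = new WAVSinkBlock(new WAVSinkSettings("output.wav"));
pipeline.Connect(source.AudioOutput, wavSink.Input);               // raw PCM, no encoder needed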
WebM is an open, royalty-free, media file format designed for the web. WebM defines the file container structure, video and audio formats. WebM files consist of video streams compressed with the VP8 or VP9 video codecs and audio streams compressed with the Vorbis or Opus audio codecs.
Use the WebMSinkSettings class to set the parameters.
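A sketch pairing hypothetical VP9 and Opus encoder blocks with a WebMSinkBlock (all three type names are assumptions):

var vp9Encoder = new VP9EncoderBlock();
var opusEncoder = new OpusEncoderBlock();
var webmSink = new WebMSinkBlock(new WebMSinkSettings("output.webm"));
pipeline.Connect(source.VideoOutput, vp9Encoder.Input);
pipeline.Connect(source.AudioOutput, opusEncoder.Input);
pipeline.Connect(vp9Encoder.Output, webmSink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(opusEncoder.Output, webmSink.CreateNewInput(MediaBlockPadMediaType.Audio));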
RTMP (Real-Time Messaging Protocol) is a protocol developed by Adobe for streaming audio, video, and data over the Internet, optimized for high-performance transmission. It enables efficient, low-latency communication and is commonly used for live broadcasts such as sports events and concerts.
Use the RTMPSinkSettings class to set the parameters.
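A sketch, assuming an RTMPSinkBlock and an RTMPSinkSettings constructor that takes the ingest URL with the stream key (both assumptions); RTMP servers generally expect H264 video and AAC audio, so the skeleton's encoders apply:

// Hypothetical RTMP sink pointed at a media server or streaming-service ingest URL
var rtmpSink = new RTMPSinkBlock(new RTMPSinkSettings("rtmp://live.example.com/app/your-stream-key"));
pipeline.Connect(videoEncoder.Output, rtmpSink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(audioEncoder.Output, rtmpSink.CreateNewInput(MediaBlockPadMediaType.Audio));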
HLS (HTTP Live Streaming) is an HTTP-based adaptive streaming communications protocol developed by Apple. It enables adaptive bitrate streaming by breaking the stream into a sequence of small HTTP-based file segments, typically using MPEG-TS fragments as the container.
Use the HLSSinkSettings class to set the parameters.
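A sketch, assuming an HLSSinkBlock configured through HLSSinkSettings; the playlist location, segment length, and related properties are not shown because their exact names are not confirmed here:

// Hypothetical HLS sink; configure playlist/segment output in HLSSinkSettings
var hlsSettings = new HLSSinkSettings();
var hlsSink = new HLSSinkBlock(hlsSettings);
// Connect the H264/AAC encoder outputs to hlsSink as in the skeleton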
HTTP MJPEG (Motion JPEG) Live is a video streaming format where each video frame is compressed separately as a JPEG image and transmitted over HTTP. It is widely used in IP cameras and webcams due to its simplicity, although it is less efficient than modern codecs.
Use the HTTPMJPEGLiveSinkSettings class to set the parameters.
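A sketch, assuming an HTTPMJPEGLiveSinkBlock and an HTTPMJPEGLiveSinkSettings constructor that takes the HTTP port to serve on (both assumptions); because every frame is an individual JPEG, only video is connected:

// Hypothetical MJPEG-over-HTTP sink serving on port 8090
var mjpegSink = new HTTPMJPEGLiveSinkBlock(new HTTPMJPEGLiveSinkSettings(8090));
pipeline.Connect(source.VideoOutput, mjpegSink.Input);             // the sink (or an MJPEG encoder placed before it) produces the JPEG frames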
NDI (Network Device Interface) is a royalty-free video transport standard developed by NewTek that enables video-compatible products to communicate, deliver, and receive broadcast-quality video in a high-quality, low-latency manner over standard Ethernet networks.
Use the NDISinkSettings class to set the parameters.
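A sketch, assuming an NDISinkBlock and an NDISinkSettings constructor that takes the NDI source name announced on the network; NDI normally carries uncompressed video and audio, so no encoders are used here (all assumptions):

var ndiSink = new NDISinkBlock(new NDISinkSettings("My NDI Stream"));
pipeline.Connect(source.VideoOutput, ndiSink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(source.AudioOutput, ndiSink.CreateNewInput(MediaBlockPadMediaType.Audio));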
SRT (Secure Reliable Transport) is an open source video transport protocol that enables the delivery of high-quality, secure, low-latency video across unpredictable networks like the public internet. It was developed by Haivision.
Use the SRTSinkSettings class to set the parameters.
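A sketch, assuming an SRTSinkBlock and an SRTSinkSettings that carries the SRT URL (host, port, and caller/listener mode); the exact members are not confirmed here:

// Hypothetical SRT sink acting as a caller towards srt://host:port
var srtSink = new SRTSinkBlock(new SRTSinkSettings("srt://receiver.example.com:8888"));
// Depending on the SDK version, the SRT sink may expect an already-muxed stream
// (see the SRT MPEG-TS sink below for a combined mux-and-send variant)
pipeline.Connect(videoEncoder.Output, srtSink.Input);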
SRT MPEG-TS combines the SRT transport protocol with the MPEG-TS container format. This allows secure, reliable transport of MPEG-TS streams over public networks, which is useful for broadcast and professional video workflows.
Use the SRTMPEGTSSinkSettings class to set the parameters.
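A sketch, assuming an SRTMPEGTSSinkBlock that both muxes to MPEG-TS and sends over SRT, configured via an SRTMPEGTSSinkSettings constructor that takes the target URL (assumptions):

var srtTsSink = new SRTMPEGTSSinkBlock(new SRTMPEGTSSinkSettings("srt://receiver.example.com:9000"));
pipeline.Connect(videoEncoder.Output, srtTsSink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(audioEncoder.Output, srtTsSink.CreateNewInput(MediaBlockPadMediaType.Audio));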
YouTube Live is a live streaming service provided by YouTube. It allows creators to broadcast live videos to their audience through the YouTube platform.
Use the YouTubeSinkSettings class to set the parameters.
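A sketch, assuming a YouTubeSinkBlock and a YouTubeSinkSettings constructor that takes the stream key from YouTube Studio (both assumptions); YouTube Live ingests H264/AAC, so the skeleton's encoders apply:

var youtubeSink = new YouTubeSinkBlock(new YouTubeSinkSettings("your-youtube-stream-key"));
pipeline.Connect(videoEncoder.Output, youtubeSink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(audioEncoder.Output, youtubeSink.CreateNewInput(MediaBlockPadMediaType.Audio));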
Shoutcast is a service for streaming media over the internet to media players, using its own cross-platform proprietary software. It allows digital audio content, primarily in MP3 or High-Efficiency Advanced Audio Coding (HE-AAC) format, to be broadcast. The most common use of Shoutcast is for creating or listening to Internet audio broadcasts.
Use the ShoutcastSinkSettings class to set the parameters.
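The complete sample below streams an MP3 file (or a live raw audio source) to a Shoutcast/Icecast server using the Shoutcast sink.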
// Pipeline
var pipeline = new MediaBlocksPipeline();

// Audio source (e.g., from a file with MP3/AAC or raw audio)
var universalSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri("input.mp3")));
// Or use VirtualAudioSourceBlock for live raw audio input:
// var audioSource = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings { Channels = 2, SampleRate = 44100 });

// Optional: audio encoder (if the source is raw audio or needs re-encoding for Shoutcast)
// Example: MP3EncoderBlock if the Shoutcast server expects MP3
var mp3Encoder = new MP3EncoderBlock(new MP3EncoderSettings() { Bitrate = 128000 }); // Bitrate in bps

pipeline.Connect(universalSource.AudioOutput, mp3Encoder.Input);
// If using VirtualAudioSourceBlock: pipeline.Connect(audioSource.Output, mp3Encoder.Input);

// Shoutcast sink
// Configure the Shoutcast/Icecast server connection details
var shoutcastSettings = new ShoutcastSinkSettings
{
    IP = "your-shoutcast-server-ip",     // Server hostname or IP address
    Port = 8000,                         // Server port
    Mount = "/mountpoint",               // Mount point (e.g., "/stream", "/live.mp3")
    Password = "your-password",          // Source password for the server
    Protocol = ShoutProtocol.ICY,        // ShoutProtocol.ICY for Shoutcast v1/v2 (e.g., icy://)
                                         // ShoutProtocol.HTTP for Icecast 2.x (e.g., http://)
                                         // ShoutProtocol.XAudiocast for older Shoutcast/XAudioCast

    // Metadata for the stream
    StreamName = "My Radio Stream",
    Genre = "Various",
    Description = "My awesome internet radio station",
    URL = "http://my-radio-website.com", // Homepage URL for your stream (shows up in directory metadata)
    Public = true,                       // Set to true to list on public directories (if the server supports it)
    Username = "source"                  // Username for authentication (often "source"; check server config)

    // Other stream parameters like audio bitrate, sample rate, and channels are typically determined
    // by the properties of the encoded input audio stream fed to the ShoutcastSinkBlock.
};

var shoutcastSink = new ShoutcastSinkBlock(shoutcastSettings);

// Connect the encoder's output (or the source's audio output, if already encoded and compatible) to the Shoutcast sink
pipeline.Connect(mp3Encoder.Output, shoutcastSink.Input);
// If the source is already encoded and compatible (e.g., MP3 file to MP3 Shoutcast):
// pipeline.Connect(universalSource.AudioOutput, shoutcastSink.Input);

// Start the pipeline
await pipeline.StartAsync();

// For display purposes, you can construct a string representing the connection:
string protocolScheme = shoutcastSettings.Protocol switch
{
    ShoutProtocol.ICY => "icy",
    ShoutProtocol.HTTP => "http",
    ShoutProtocol.XAudiocast => "xaudiocast", // Note: the actual scheme might be http for XAudiocast
    _ => "unknown"
};
Console.WriteLine($"Streaming to Shoutcast server: {protocolScheme}://{shoutcastSettings.IP}:{shoutcastSettings.Port}{shoutcastSettings.Mount}");
Console.WriteLine($"Stream metadata URL (for directories): {shoutcastSettings.URL}");
Console.WriteLine("Press any key to stop the stream...");
Console.ReadKey();

// Stop the pipeline (important for graceful disconnection and resource cleanup)
await pipeline.StopAsync();