Introduction
GStreamer is a framework for creating streaming media applications. With GStreamer, it is possible to design and build low-latency applications that can handle any kind of streaming data flow, including both audio and video.
GStreamer core provides a framework for plugins, data flow, and media type handling. It also provides an API to write applications using the various plugins.
The plugin-based framework provides various codecs and other functionality that can be linked and arranged in a pipeline, which defines the flow of the data. However, as of the time of writing, there is no official Node.js port or binding for GStreamer.
In this post, we’ll talk about GStreamer’s functionalities and setting it up with Node through the following sections:
- Features and use cases
- GStreamer plugin architecture
- GStreamer installation
- Setting up GStreamer with Node.js
- Some GStreamer limitations
Features and use cases
One of the major use cases of GStreamer is for building media players. As a developer, you can use an extensive set of powerful tools to create media pipelines without writing a single line of code.
By default, GStreamer includes components for building a media player with support for a very wide variety of formats, including MP3, Ogg/Vorbis, MPEG-1/2, AVI, QuickTime, mod, and so on.
GStreamer, however, offers much more than other media players. Its main advantage is that the pluggable components can be mixed into arbitrary pipelines, making it possible to write a full-fledged video or audio editing application. Specifically, GStreamer provides:
- An API for multimedia applications
- A plugin architecture
- A pipeline architecture
- A mechanism for media type handling/negotiation
GStreamer plugin architecture
Via plugins, GStreamer can bridge to other multimedia frameworks to reuse existing components (e.g. codecs) and other platform input/output mechanisms.
The core of GStreamer is essentially media-agnostic. It only knows about bytes and blocks, and only contains basic elements. All of the media-handling functionality is provided by plugins external to the core. These tell the core how to handle specific types of media.
GStreamer plugins can be classified into the following groups:
- Protocol handling
- Sources: for audio and video (involves protocol plugins)
- Formats: parsers, formatters, muxers, demuxers, metadata, subtitles
- Codecs: coders and decoders
- Filters: converters, mixers, effects
- Sinks: for audio and video (involves protocol plugins)
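These categories chain together in a typical pipeline. The sketch below is a hypothetical example (`song.ogg` is a placeholder file) that links a source, a demuxer, a decoder, a filter, and a sink using the standard `gst-launch-1.0` syntax:

```shell
# source -> demuxer -> decoder -> filter (converter) -> sink
gst-launch-1.0 filesrc location=song.ogg ! oggdemux ! vorbisdec ! audioconvert ! autoaudiosink
```

The `!` operator links elements together, much like a Unix pipe.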
GStreamer installation
To install GStreamer, the documentation provides various approaches to choose from based on your operating system.
For macOS, we need OS X Snow Leopard (10.6) or later and Xcode 3.2.6 or later. However, the recommended setup is macOS Sierra with Xcode 8. We can install both the runtime and the development installers from the GStreamer downloads page.
For other operating systems/environments, including Windows, iOS, Android, and Linux, we can have a look at the downloads page in the documentation, which consists of a list of all supported ecosystems and ways of building GStreamer SDKs for developmental purposes.
We can navigate to `/Library/Frameworks/GStreamer.framework/Commands` on our system path to have a look at the commands available for playing around with GStreamer. Some of the popular commands include `gst-launch-1.0`, `gst-inspect-1.0`, and `gst-play-1.0`.
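As a quick sketch (the file names here are hypothetical), each of these tools can be run directly from a terminal:

```shell
# list an element's pads, capabilities, and properties
gst-inspect-1.0 vorbisdec

# play a local file to confirm the installation works
gst-play-1.0 sample.mp4

# run an ad-hoc pipeline: render a test pattern to a window
gst-launch-1.0 videotestsrc ! autovideosink
```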
Setting up GStreamer with Node.js
After the installation, we can use the Node.js runtime to consume a GStreamer pipeline and output the result to a web browser.
Let’s create a folder of our choice, install Express.js using npm or Yarn, and follow the instructions to set up a basic project with a `package.json` file:

```shell
npm install express
# or
yarn add express
```
Then, create an `index.js` file to hold the JavaScript code for our streaming example with GStreamer. See the `index.js` file below:
```javascript
const express = require('express');
const http = require('http');
const net = require('net');
const child = require('child_process');

const app = express();
app.use(express.static(__dirname + '/'));

const httpServer = http.createServer(app);
const port = 3000;

let gstMuxer; // holds the spawned GStreamer process

//send the HTML page which holds the video tag
app.get('/', function (req, res) {
  res.sendFile(__dirname + '/index.html');
});

//stop the connection
app.post('/stop', function (req, res) {
  console.log('Connection closed using /stop endpoint.');
  if (gstMuxer != undefined) {
    gstMuxer.kill(); //kill the GStreamer pipeline
  }
  gstMuxer = undefined;
  res.end();
});

//send the video stream
app.get('/stream', function (req, res) {
  res.writeHead(200, {
    'Content-Type': 'video/webm',
  });

  const tcpServer = net.createServer(function (socket) {
    socket.on('data', function (data) {
      res.write(data);
    });
    socket.on('close', function () {
      console.log('Socket closed.');
      res.end();
    });
  });

  tcpServer.maxConnections = 1;

  tcpServer.listen(function () {
    console.log('Connection started.');
    if (gstMuxer == undefined) {
      console.log('inside gstMuxer == undefined');
      const cmd = 'gst-launch-1.0';
      const args = getGstPipelineArguments(this);
      gstMuxer = child.spawn(cmd, args);

      gstMuxer.stderr.on('data', onSpawnError);
      gstMuxer.on('exit', onSpawnExit);
    } else {
      console.log('New GST pipeline rejected because gstMuxer != undefined.');
    }
  });
});

httpServer.listen(port);
console.log(`Camera Streaming App listening at http://localhost:${port}`);

process.on('uncaughtException', function (err) {
  console.log(err);
});

//functions
function onSpawnError(data) {
  console.log(data.toString());
}

function onSpawnExit(code) {
  if (code != null) {
    console.log('GStreamer error, exit code ' + code);
  }
}

function getGstPipelineArguments(tcpServer) {
  const args = ['/Users/alexandernnakwue/Downloads/samplevideo.mp4',
    'pattern=ball', '!',
    'video/x-raw,width=320,height=240,framerate=100/1', '!',
    'vpuenc_h264', 'bitrate=2000', '!',
    'mp4mux', 'fragment-duration=10', '!',
    'tcpclientsink', 'host=localhost',
    'port=' + tcpServer.address().port];
  return args;
}
```
As we can see in the file above, we have three endpoints:
- An endpoint to send the HTML page, which holds the video tag
- An endpoint to send the video stream
- An endpoint to end the connection
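With the server running (`node index.js`), these endpoints can be exercised from a terminal. This is a sketch assuming the port 3000 used in the listing:

```shell
# fetch the HTML page that holds the video tag
curl http://localhost:3000/

# save a few seconds of the stream to a file (stop with Ctrl+C)
curl --output clip.webm http://localhost:3000/stream

# tear down the GStreamer pipeline
curl -X POST http://localhost:3000/stop
```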
Next, create the HTML page (`index.html`), which holds the video tag as shown below.
```html
<!DOCTYPE html>
<html>
<head>
  <title>GStreamer with NodeJS Demo</title>
  <meta name="viewport" content="width=device-width, initial-scale=0.9">
  <style>
    html,
    body {
      overflow: hidden;
    }
  </style>
  <script>
    function buffer() {
      //Start playback as soon as possible to minimize latency at startup
      const dStream = document.getElementById('vidStream');
      try {
        dStream.play();
      } catch (error) {
        console.log(error);
      }
    }
  </script>
</head>
<body onload="buffer();">
  <video id="vidStream" width="640" height="480" muted>
    <source src="/stream" type="video/mp4" />
    <source src="/stream" type="video/webm" />
    <source src="/stream" type="video/ogg" />
    <!-- fallback -->
    Your browser does not support the video element.
  </video>
</body>
</html>
```
As I mentioned in the introduction, there are currently no official ports or bindings for Node.js. The code above is adapted from this Stack Overflow post.
We can proceed to use the `gst-launch-1.0` command to start the streaming app with its arguments, which include the video or audio source for streaming, the TCP port and address, settings, and so on. As the page loads, the `play()` method starts the video stream as soon as possible.
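For reference, a similar pipeline can be tried from the command line first. The sketch below swaps the hardware-specific `vpuenc_h264` encoder for the software `x264enc` element and uses `videotestsrc` with a hypothetical fixed port, so it runs without a capture device (it assumes a TCP server, such as the Node app above, is already listening on that port):

```shell
gst-launch-1.0 videotestsrc pattern=ball \
  ! video/x-raw,width=320,height=240,framerate=100/1 \
  ! x264enc tune=zerolatency bitrate=2000 \
  ! mp4mux fragment-duration=10 \
  ! tcpclientsink host=localhost port=5000
```

Note that `x264enc` takes its bitrate in kbit/s, so `bitrate=2000` here means roughly 2 Mbit/s.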
Note: This only works in Chromium-based browsers. I’ll explain more below.
Some GStreamer limitations
The current implementation of GStreamer for Node.js is non-standardized and still lacking. For instance, it is not fully browser compatible and only works in Chromium-based browsers, since some of the HTTP headers Chrome needs to load resources are not available. Additionally, building GStreamer on some system architectures remains a difficult task because it still contains lots of bugs.
GStreamer does not yet support ports for multiple different programming languages directly. This means that developers who intend to make use of GStreamer in Node.js applications need to use the `node-addon-api` to call C code from Node directly. This method, however, requires lots of work and can be especially error-prone when building with node-gyp.
Conclusion
As we may have noticed, there are limited GStreamer bindings for Node.js today.
There are other bindings available, such as node-gstreamer-superficial, but according to its documentation, it does not attempt to be a complete JS binding for GStreamer, and will hopefully one day be replaced by (or implemented with) node-gir.
Other available bindings or hacks simply do not work as expected, are not standardized, or are bug-prone. This is a huge challenge, and in the near future, a standardized, industry-wide Node.js binding needs to be built.
The post Using Gstreamer in Node.js appeared first on LogRocket Blog.