Make a WebSocket Stream
Nvision with Streaming Videos
A WebSocket stream is another way to submit images to the Nvision service. The easiest way to use the WebSocket stream is through our provided SDK.
To use Nvision with streaming videos, your application needs to implement the following:
Headless agent: develop a WebSocket stream using the SDK.
Webhook endpoint: create a webhook URL to receive the HTTP callback.
Callback configuration: set the WebSocket streaming callback URL in the Nvision service page.
Some applications need to run as a headless agent on edge compute. For example, you might develop a headless agent running on a Raspberry Pi that reads video frames from CCTV and submits them to the Nvision service, as shown in the following diagram.
Initialize your NPM project, then install @nipacloud/nvision and opencv4nodejs using the npm or yarn command.
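As a rough illustration, the agent's main loop might look like the sketch below. The opencv4nodejs calls are standard, but the streaming client factory (objectDetectionStream), its send method, the RTSP URL, and the NVISION_API_KEY environment variable are assumptions for illustration rather than the documented SDK API; consult the SDK reference for the actual streaming interface.

```typescript
// Minimal sketch of a headless streaming agent. Names marked "assumed" are
// illustrative only -- check the @nipacloud/nvision SDK reference for the real API.
import * as cv from "opencv4nodejs";
import * as nvision from "@nipacloud/nvision";

// Open the CCTV feed; opencv4nodejs accepts a device index or a stream URL.
const capture = new cv.VideoCapture("rtsp://<CCTV_HOST>/stream"); // placeholder URL

// Assumed factory name and options for a WebSocket streaming client.
const stream = (nvision as any).objectDetectionStream({
  apiKey: process.env.NVISION_API_KEY, // assumed environment variable
});

setInterval(() => {
  const frame = capture.read();            // grab one frame from the feed
  if (frame.empty) return;                 // skip empty frames
  const jpeg = cv.imencode(".jpg", frame); // encode the frame as a JPEG buffer
  stream.send(jpeg.toString("base64"));    // assumed method for pushing a frame
}, 1000); // roughly one frame per second
```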
To receive results, you need to configure a webhook endpoint for your service. A webhook is a user-defined HTTP callback endpoint.
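As a minimal sketch, the webhook could be an Express server listening on port 3000 (the port forwarded by the ngrok command below). Express, the route path /nvision-callback, and the port are assumptions for illustration; any HTTP server that accepts a JSON POST will work.

```typescript
// Minimal webhook receiver sketch using Express (an assumption -- any HTTP
// server that can accept a JSON POST request will do).
import express from "express";

const app = express();
app.use(express.json({ limit: "5mb" })); // parse JSON bodies from the callback

// Hypothetical route path; expose whatever path you choose publicly via ngrok.
app.post("/nvision-callback", (req, res) => {
  console.log("Received prediction results:", JSON.stringify(req.body));
  res.sendStatus(200); // acknowledge the callback
});

app.listen(3000, () => console.log("Webhook listening on http://localhost:3000"));
```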
Headless Agent
By running the command ngrok http 3000, you will get public URLs as follows:
Because the WebSocket protocol is used, we provide a custom callback endpoint configuration that allows you to run independent backends for receiving and analyzing prediction results.
WebSocket Streaming Callback URL
Now, enter your exposed URL into the WebSocket streaming callback URL field in the service's settings.
Lastly, when an image is processed, the Nvision service will make an HTTP request to the specified endpoint with the prediction results, structured as follows.
Method: POST
Body: The request body is delivered as JSON containing the prediction results.
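To illustrate how the callback body might be consumed, the sketch below logs each detected object. The field names (detected_objects, name, confidence) are hypothetical and not the documented Nvision schema; log a real callback body from your webhook first and adjust the types accordingly.

```typescript
// Illustrative parsing of a callback body. The field names below are
// assumptions, not the documented Nvision schema.
interface DetectedObject {   // hypothetical shape of one prediction
  name: string;
  confidence: number;
}

// Could be called from the Express route in the earlier sketch,
// e.g. handleCallbackBody(req.body).
function handleCallbackBody(body: { detected_objects?: DetectedObject[] }): void {
  for (const obj of body.detected_objects ?? []) {  // assumed field name
    console.log(`${obj.name}: ${(obj.confidence * 100).toFixed(1)}%`);
  }
}
```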
Output Logs from Streaming agent (left) and Webhook callback (right)
For testing the webhook integration, we use ngrok to create a secure tunnel on the local machine along with public URLs for exposing the local web server.