HTML5 LIVE VIDEO STREAMING VIA WEBSOCKETS

When I built my Instant Webcam App, I was searching for solutions to stream live video from the iPhone's Camera to browsers. There were none.

When it comes to (live) streaming video with HTML5, the situation is pretty dire. HTML5 Video currently has no formalized support for streaming whatsoever. Safari supports the awkward HTTP Live Streaming and there's an upcoming Media Source Extensions standard as well as MPEG-DASH. But all these solutions divide the video into short segments, each of which can be downloaded by the browser individually. This introduces a minimum lag of 5 seconds.

So here's a totally different solution that works in any modern browser: Firefox, Chrome, Safari, Mobile Safari, Chrome for Android and even Internet Explorer 10.

A recording from our office in Darmstadt, Germany. For a live streaming example, please check the free iOS app instead.

It's quite backwards, uses outdated technology and doesn't support audio at the moment. But it works. Surprisingly well.

The Camera Video is encoded to MPEG by ffmpeg on a local machine and then sent to a public webserver via HTTP. On the webserver a tiny nodejs script simply distributes the MPEG stream via WebSockets to all connected Browsers. The Browser then decodes the MPEG stream in JavaScript and renders the decoded pictures into a Canvas Element.
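
Stripped down, the relay part is almost trivial. The following is a simplified sketch of the idea, not the real stream-server.js (see below), which additionally checks a secret and tells each new client the video dimensions:

// Simplified sketch of the relay idea: whatever MPEG data arrives on the HTTP port
// is broadcast verbatim to every connected WebSocket client.
var http = require('http');
var WebSocketServer = require('ws').Server;

var wss = new WebSocketServer({ port: 8084 });

http.createServer(function (request, response) {
    // ffmpeg pushes the MPEG stream to this port via a long-running HTTP request
    request.on('data', function (chunk) {
        wss.clients.forEach(function (client) {
            if (client.readyState === 1) {          // 1 = OPEN
                client.send(chunk, { binary: true });
            }
        });
    });
}).listen(8082);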

You can even use a Raspberry Pi to stream the video. It's a bit on the slow side, but in my tests it had no problem encoding 320x240 video on the fly at 30fps. This makes it, to my knowledge, the best video streaming solution for the Raspberry Pi right now.

Here's how to set this up. First, get a current version of ffmpeg. Up-to-date packages are available at deb-multimedia. If you are on Linux, your webcam should be available at /dev/video0 or /dev/video1. On OSX or Windows you may be able to feed ffmpeg through VLC somehow.

Make sure you have nodejs installed on the server through which you want to distribute the stream. Get the stream-server.js script from jsmpeg.

Now install its dependency, the ws WebSocket package, and start the server with a password of your choosing. This password ensures that no one can hijack the video stream:

npm install ws
node stream-server.js yourpassword

You should see the following output when the server is running correctly:

Listening for MPEG Stream on http://127.0.0.1:8082/<secret>/<width>/<height>
Awaiting WebSocket connections on ws://127.0.0.1:8084/

With the nodejs script started on the server, you can now start ffmpeg on the local machine and point it to the domain and port where the nodejs script is running:

ffmpeg -s 640x480 -f video4linux2 -i /dev/video0 -f mpeg1video \
-b 800k -r 30 http://example.com:8082/yourpassword/640/480/

This starts capturing the webcam video in 640x480 and encodes an MPEG video with 30fps and a bitrate of 800kbit/s. The encoded video is then sent to the specified host and port via HTTP. Make sure to provide the correct secret, as specified when starting stream-server.js. The width and height parameters in the destination URL also have to be set correctly; the stream server otherwise has no way to figure out the correct dimensions.
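
For the curious: the server reads those dimensions straight out of the request URL and forwards them to every newly connected WebSocket client as a tiny 8-byte header (4 magic bytes plus two 16-bit integers), which is how the JavaScript player knows what size to decode. A sketch of that logic, condensed from stream-server.js:

// Condensed from stream-server.js: parse /<secret>/<width>/<height> from the incoming
// HTTP request and ship the dimensions to each new WebSocket client in an 8-byte header.
var STREAM_SECRET = process.argv[2];
var width = 320, height = 240;

function handleStreamUrl(request) {
    var params = request.url.substr(1).split('/');   // e.g. ["yourpassword", "640", "480"]
    if (params[0] !== STREAM_SECRET) { return false; }
    width  = (params[1] || 320) | 0;
    height = (params[2] || 240) | 0;
    return true;
}

function sendStreamHeader(socket) {
    // struct { char magic[4]; unsigned short width, height; }, big-endian
    var streamHeader = new Buffer(8);
    streamHeader.write('jsmp');
    streamHeader.writeUInt16BE(width, 4);
    streamHeader.writeUInt16BE(height, 6);
    socket.send(streamHeader, { binary: true });
}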

On the Raspberry Pi you will probably have to turn down the resolution to 320x240 to still be able to encode with 30fps.

To view the stream, get the stream-example.html and jsmpg.js from the jsmpeg project. Change the WebSocket URL in the stream-example.html to the one of your server and open it in your favorite browser.
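
The viewer side is tiny; stream-example.html essentially boils down to something like this (paraphrased, so use the actual file from the jsmpeg project; only the WebSocket URL and canvas need to match your setup):

// Roughly what stream-example.html does: open the WebSocket and hand it to the jsmpg.js
// player, which decodes the MPEG stream and draws each frame onto the canvas.
// Assumes a <canvas id="videoCanvas" width="640" height="480"> element and jsmpg.js on the page.
var canvas = document.getElementById('videoCanvas');
var client = new WebSocket('ws://example.com:8084/');
var player = new jsmpeg(client, { canvas: canvas });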

If everything works, you should be able to see a smooth camera video with less than 100ms lag. Quite nice for such hackery and a humble MPEG decoder in JS.

Again, for an easier to use solution, check the Instant Webcam App.

Wednesday, September 11th 2013

— Dominic Szablewski, @phoboslab

 

128 Comments:

#1 – meh – Wednesday, September 11th 2013, 16:42

What a novel solution, but what a fucking backwards step. It's 2013 and we still don't have ubiquitous live streaming across browsers. end rant.

#2 – dude – Wednesday, September 11th 2013, 17:07

what about WebRTC? Can't it be leveraged somehow?

#3 – Simon – Wednesday, September 11th 2013, 17:19

How have you managed to get the encoding on the iPhone that fast?
I tried to build something like this for a small project. I compiled ffmpeg for iOS and used it to encode a video stream. Maybe I did something wrong, but the encoder couldn't keep up with the camera, the resulting video was always very bad.

In the end, I just convert each video frame to a JPEG image. This is hardware accelerated on iOS devices and I managed to get ~20 fps on my local wifi. No fancy JavaScript MPEG decoder needed; I just reload the image every 0.05s and draw it on a canvas

#4 – john john – Wednesday, September 11th 2013, 17:23

It's all fun and all, but there is no audio :P

#5 – joel – Wednesday, September 11th 2013, 17:25

Hello, i am a research of video streaming. And i like your work !! 

#6 – john john – Wednesday, September 11th 2013, 17:26

I never met a research before

#7 – Robert Swain – Wednesday, September 11th 2013, 17:27

Have you looked at fragmented mp4 or WebM?

#8 – wedtm – Wednesday, September 11th 2013, 17:43

What's the difference between this and WebRTC?

#9 – Dominic – Wednesday, September 11th 2013, 18:08

@Simon: I made the same discovery at first. Encoding MPEG1 seemed very slow on iOS. However, I found out that the ffmpeg build script I used disabled all ARM optimizations. Maybe you had the same issue:
github.com/kolyvan/kxmovie/issues/55#issuecomment-20847017

With this, I got about ~20fps of 320x240 video on the iPhone4. I then used the profiler to find some bottlenecks in ffmpeg and re-wrote some of them with ARM NEON instructions. I'll probably write a blog post about this in the coming days.

@wedtm: WebRTC isn't available everywhere (this here works on iOS and android for instance) and implementing it in your own native App (i.e. like my iPhone App) seems impossibly complicated. I still have some research to do about this, though.

#10 – Ray Brooks – Wednesday, September 11th 2013, 18:31

Hey, this is great news! I was trying for an HTML5 streaming solution for a while (woeful native support drove me elsewhere and I will endeavour to check out your technique as soon as I can), but in the meantime, have you seen my streaming solution on the R-Pi forum? It uses a Strobe Player for H264 decoding, which the Pi supplies natively, and I have full HD video at 30fps in the browser with less than 1/2 second delay.

#11 – Christopher Elwell – Wednesday, September 11th 2013, 18:36

Works surprisingly well. But, as stated, no audio?

#12 – foo – Wednesday, September 11th 2013, 19:04

Can you not do this leveraging the GPU on the Pi? ffmpeg -f h264 instead or does jsmpeg not support h264?

#13 – Ankur Oberoi – Wednesday, September 11th 2013, 19:11

Great job getting this to work on low-scale hardware!

If you or anyone else is still trying to get video streaming from iOS to a browser (or vice-versa, or iOS to iOS, or web to web even) OpenTok is a pretty great solution.

#14 – Criação – Wednesday, September 11th 2013, 19:20

A nice solution anyway! Thanks a lot for this!

#15 – Anthony Catel – Wednesday, September 11th 2013, 19:31

It's awkward that in 2013 we have to complexify design to this point :
"Streaming video through WebSocket (thus over http upgrade) through a plain-tcp-to-websocket-proxy on top of...".
It feels over-engineered and like TCP-Sockets are a revolution. I know, we can't easily expose plain-socket because of security but still... What's next? UDP for 2018? Socket-binding for 2022?
(Nice work though)

#16 – Dan – Wednesday, September 11th 2013, 20:17

Excellent solution. But you should leave a light on over night. Maybe shine it on your brand ;) I had to take a hard look to realize that there were shapes in the black background and the stream wasn't somehow broken. 
 

#17 – Lindsay – Wednesday, September 11th 2013, 20:54

A similar approach worked well for me about a year ago, using almost the same technology and stack. Instead of pushing the mpeg stream via websockets, though, I simply ran an ffmpeg capture loop of jpeg images into a ramdisk and pushed those down the websocket pipe using nodejs. You can render those very quickly (15+ fps depending on image size) on a canvas without any special js decoders on the client side. The result was a very lightweight real-time remote system desktop (1080) or camera monitor that worked well over low bandwidth. Since all it was really doing was serving static images, it also scaled really well and worked over websockets and long-polling connections. The long-polling actually worked better, as the client could adjust for network response times -- a slightly different url gave the current frame at a different quality.
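
(For reference, the client side of that JPEG-per-frame approach needs no decoder at all; something along these lines is enough to paint each received frame. This is an illustrative sketch, not Lindsay's actual code, and it assumes the server sends one complete JPEG per WebSocket message.)

// Illustrative sketch: draw JPEG frames received over a WebSocket onto a canvas.
// Assumes one complete JPEG image per message; not the commenter's actual code.
var canvas = document.getElementById('videoCanvas');
var ctx = canvas.getContext('2d');

var ws = new WebSocket('ws://example.com:8084/');
ws.binaryType = 'blob';

ws.onmessage = function (event) {
    var url = URL.createObjectURL(event.data);      // event.data is a JPEG Blob
    var img = new Image();
    img.onload = function () {
        ctx.drawImage(img, 0, 0, canvas.width, canvas.height);
        URL.revokeObjectURL(url);                   // release the Blob URL once drawn
    };
    img.src = url;
};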

#18 – namuol – Thursday, September 12th 2013, 06:41

You've made a career out of trampling the slow-moving, inconsistent world of HTML "standards" in the name of good web experiences. Once again, bravo. :)

#19 – balamaci – Friday, September 13th 2013, 00:18

Thanks for this, it's great stuff.

Ran on raspberry PI with:
raspivid -t 999999 -fps 15 -vf -w 320 -h 240 -o - | avconv -f h264 -i pipe:0 -f mpeg1video -b 500k -r 25 127.0.0.1:8082/s3cret/320/240/

but the delay is somewhere around 10 sec and avconv is eating up all the CPU.

#20 – Engleek – Friday, September 13th 2013, 14:48

I'm wondering if you couldn't achieve the same thing using a GIF file with no specified number of frames.

I saw this technique used to emulate web sockets [hackaday.com/2013/02/07/gifsockets-websockets-using-animated-gif-files/], so presumably it could be used to stream video at low frame rates with little bother.

#21 – ImDeity – Saturday, September 14th 2013, 06:44

Works great, thanks for making this.

Made a live player stream for our Minecraft server:
stream.imdeity.com/

#22 – Phil – Saturday, September 14th 2013, 17:49

Installed. Love it.

#23 – John Doe – Sunday, September 15th 2013, 18:29

Installed on iphone4 and tried to connect via my ipad 1; doesnt work :(

#24 – Max – Sunday, September 15th 2013, 19:15

Great App! 
Could you describe how this technique can be used to embed the stream on a website in the www?

#25 – Jdog – Sunday, September 15th 2013, 22:12

hey there
i tried streaming the output of the iOS app via the internet to another IP but it seems like I'm too stupid to manage that...
I opened ports and forwarded them to my device but nothing happens...
Can someone help me?

#26 – Florian – Monday, September 16th 2013, 13:07

Hi,

Just wanted to say thank you for giving this app away for free. It's a great little app! I'm pretty sure it will help me find out who's been regularly stealing my newspaper for the past couple of months. :)

Best regards!

#27 – vivian – Tuesday, September 24th 2013, 09:49

Why does nobody use the 800li media server? Just click and copy-paste to make a complete live broadcasting stream. It produces HTML code and an SWF playback address. It isn't well known, but I tried it and it works well!!!!!

#28 – vivian – Tuesday, September 24th 2013, 09:53

I have to add another post: the 800li server software can support Android and iOS viewing!!! I found this yesterday!

#29 – Max – Thursday, October 3rd 2013, 13:23

Hey, can you describe more how the iOS app works? ffmpeg + nodejs?

#30 – Dominic – Tuesday, October 8th 2013, 16:14

The iOS App uses ffmpeg for encoding and libwebsockets to serve the static files (.html, .css, .js) and of course the video stream:
libwebsockets.org/

#31 – imbm – Wednesday, October 16th 2013, 13:26

There is a memory leak in your jsmpeg.js when you leave the browser open. Chrome runs longer, but Firefox soon grows to 2GB in a short time.

#32 – imbm – Wednesday, October 16th 2013, 13:33

I mean it will freeze when you leave the browser open for a long time and keep monitoring your webcam. It's a great job though.

#33 – EddyF – Wednesday, November 6th 2013, 23:56

When I try to stream the mpg video I get an error message with the following: "Option video_size not found."
I am using 
ffmpeg -s 640/480 -f mpeg -i /dev/video0 -f mpeg1video -b 800k -r 30 my-IP-Address:8082/pass/640/480

I've tried several different heights and widths, any ideas?
Thank you!

#34 – K Buckingham – Saturday, November 9th 2013, 02:53

This is the coolest thing I've ever seen.

#35 – surgemcgee – Tuesday, November 12th 2013, 15:54

Dude, cool. Well done with the canvas. Tickled by this..

#36 – chu – Saturday, November 23rd 2013, 15:13

wow, the app is so nice!
i wonder how you used ffmpeg on ios.

#37 – Mano – Sunday, November 24th 2013, 23:19

What if I want to convert an RTMP-based stream coming from AMS and then send it over the websocket server to clients?

#38 – Reynold – Monday, December 2nd 2013, 05:34

Great work! Thank you so much :)

#39 – Russell – Saturday, December 14th 2013, 22:50

Sometimes older, simpler, faster is the magic combo instead of bleeding edge. All I wanted was a very simple cross-platform client-to-client video stream for recording studio links. This is great for my iPhone cam, but I'm gonna see if I can implement this on Linux (my recording system) and also use the iSight cam on my old MacBook. Don't need the audio... just fast, simple, LAN video!!
Nice app!!

#40 – FRED – Friday, December 20th 2013, 09:37

Wonderful! What a nice app. How can I capture the full HD? Any suggestions? I want to use it for stop motion and other trick films.

#41 – Jason – Friday, January 17th 2014, 13:29

This is absolutely fantastic – thanks for taking the time to put the demo together. I'd love to use it in a project of my own if possible. What is the license for jsmpeg?

#42 – John Tattersall – Thursday, February 13th 2014, 15:14

Brilliant Dominic!
I was wondering if this method is still the best way for Pi > public and responsive online video streaming in 2014 ?
I am building an educational garden robot for kids : sprigawatt.com and need to get this happening asap - thanks!
I'm based in Berlin, so if you know anyone here who might be helpful that'd be amazing too - thanks! John

#43 – Peter Sorensen – Friday, February 14th 2014, 12:41

You could simply setup your server (if fast enough/depending on camera source codec) to encode/mux the content to a browser supported container like mp4 with h264 video and aac/mp3/ogg/wav audio, all depending on browser...

This is technically not live, the browser sees it as a video with an unknown duration...

All depending on how the browser buffers and stores loaded data, this may cause big temporary files... Though it may make it possible to pause and continue the live stream, as it's theoretically recorded and then played back... (if the browser stores incoming data)

#44 – Amir – Sunday, February 16th 2014, 16:20

Hi guys,
I need to stream by any means to my server and display it on mobile.
Is there any solution?

#45 – Viktoria – Sunday, February 16th 2014, 16:57

Hi, excellent tutorial, but what about the audio please? I would like to setup something like skype on my server, is that possible at all?

#46 – Stefano – Thursday, February 20th 2014, 14:49

Dominic, You rock!!!
Thanks for sharing your knowledge.
I tried Instant Webcam too: cool!

#47 – vlad – Monday, March 3rd 2014, 22:17

hello. would you like to help me building a video streaming server to include it with a mobile app? let me know ask@workedo.com

#48 – Michael Romanenko – Tuesday, March 11th 2014, 21:37

Great idea, superb implementation. Instant Webcam is fantastic! I wish JSMPEG become more mature and gain community support!

#49 – curiosul – Tuesday, March 25th 2014, 15:29

I'm a beginner and your tutorial looks hard for me. How do I install the WebSocket package on my raspberry? Do you have any resources for a total beginner? I'm trying to broadcast from my raspberry pi to a C# application... Thanks.

#50 – Learning – Friday, March 28th 2014, 18:52

Very good job. I'm new to this subject, so I was wondering if there is a way to clear the password, or at least to know its default, since I can't get your code running at all.. help please. Thank you.

#51 – Alex Cohn – Friday, April 11th 2014, 15:00

Thanks! I used your approach to answer a related question.

#52 – nosretep – Monday, April 14th 2014, 08:51

Awesome! Thanks!

#53 – bpersich – Monday, April 14th 2014, 17:30

What do I actually do with the jsmpg.js and stream-server.html files? Does jsmpg.js go in the same directory as stream-server.js? Do I need to run it in node like I did with stream-server.js?

#54 – nosretep – Monday, April 14th 2014, 20:39

find stream-server.html in your file system and simply open it in your browser (which includes jsmpg.js). it will have the file:/// stuff in the address bar ...

you'll have to figure out on your own how to serve stream-server.html (and jsmpg.js) via webserver, I don't think that's an aspect that they wanted/needed to spend time on in this tutorial.

#55 – Ithorion – Saturday, April 19th 2014, 15:41

If you are using a Raspberry Cam, you can't access /dev/video0 directly.
I had to install the video4linux2 driver and follow the steps described in:
www.linux-projects.org/modules/sections/index.php?op=viewarticle&artid=14

Thanks a lot for sharing this !! It's awesome !! i'll try to add OpenCV & do some transformation in between !

#56 – bryce – Monday, April 21st 2014, 21:53

Is it possible using this method to stream multiple videos? Do we have to modify the code to do this, or use multiple ports and sockets?

#57 – Ignacio – Monday, May 12th 2014, 11:22

Did anyone get the webcam stream running on Mac OSX?

Really struggling with this...

Thanks in advance

#58 – Ingvar Stepanyan – Wednesday, May 14th 2014, 22:49

Hi, you might be also interested in checking out github.com/RReverser/mpegts - pure frontend JavaScript HTTP Live Streaming realtime converter and player which performs realtime conversion of MPEG-TS video chunks to MPEG-4 in a separate thread using Web Worker and playing them in order in the main one.

#59 – Mike Purvis – Wednesday, May 21st 2014, 21:02

Thank you for posting this, it's a great resource. FYI, it does work on Windows, ffmpeg and dshow for input. For example on the input ffmpeg -s 320x240 -f dshow -i video="Logitech HD Pro Webcam C920"

#60 – mike – Saturday, May 24th 2014, 08:48

when I try to install ws using "npm install ws", I get the following error

npm http GET registry.npmjs.org/ws

npm ERR! Error: failed to fetch from registry: ws
npm ERR! at /usr/share/npm/lib/utils/npm-registry-client/get.js:139:12
npm ERR! at cb (/usr/share/npm/lib/utils/npm-registry-client/request.js:31:9)
npm ERR! at Request._callback (/usr/share/npm/lib/utils/npm-registry-client/request.js:136:18)
npm ERR! at Request.callback (/usr/lib/nodejs/request/main.js:119:22)
npm ERR! at Request.<anonymous> (/usr/lib/nodejs/request/main.js:212:58)
npm ERR! at Request.emit (events.js:88:20)
npm ERR! at ClientRequest.<anonymous> (/usr/lib/nodejs/request/main.js:209:10)
npm ERR! at ClientRequest.emit (events.js:67:17)
npm ERR! at ClientRequest.onError (/usr/lib/nodejs/request/tunnel.js:164:21)
npm ERR! at ClientRequest.g (events.js:156:14)
npm ERR! You may report this log at:
npm ERR! <bugs.debian.org/npm>
npm ERR! or use
npm ERR! reportbug --attach /home/suresh/npm-debug.log npm
npm ERR! 
npm ERR! System Linux 3.8.0-39-generic
npm ERR! command "node" "/usr/bin/npm" "install" "ws"
npm ERR! cwd /home/suresh
npm ERR! node -v v0.6.12
npm ERR! npm -v 1.1.4
npm ERR! message failed to fetch from registry: ws
npm ERR! 
npm ERR! Additional logging details can be found in:
npm ERR! /home/suresh/npm-debug.log
npm not ok

can anyone tell me why this happens?

thanks in advance.

#61 – harlix – Monday, May 26th 2014, 17:22

Nice work! I'm just fiddling around with a libwebsockets based webserver for embedded devices in C and tried to stream some video without the need of a GUI and node.js on the server side. The MJPEG stream of a Logitech C170 cam is transcoded to MPEG1 by FFmpeg and then pushed into a named pipe, where the server periodically reads from and broadcasts the MPEG data to all websocket clients. It works on the Raspberry Pi with

ffmpeg -f v4l2 -y -s 320x240 -r 20 -c:v mjpeg -i /dev/video0 -f mpeg1video -q:v 6 -vf "crop=iw-mod(iw\,2):ih-mod(ih\,2)" -an -b:v 0 -b:a 0 /tmp/videopipe.mpg

but with a delay of about 2 seconds and FFmpeg eating up 90% CPU. A better solution is the BeagleBone which has an ARM Cortex-A8 CPU with NEON technology and causes much less delay. Both devices are running headless ArchLinux for ARM.
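
(The same pipe-reading idea can also be expressed in a few lines of Node, shown here only as a sketch under the assumption that ffmpeg writes to /tmp/videopipe.mpg as in the command above; harlix's actual server is C/libwebsockets, not this.)

// Sketch: read the MPEG data ffmpeg writes into the named pipe and broadcast it to
// WebSocket clients, instead of receiving it over HTTP as stream-server.js does.
var fs = require('fs');
var WebSocketServer = require('ws').Server;

var wss = new WebSocketServer({ port: 8084 });

fs.createReadStream('/tmp/videopipe.mpg').on('data', function (chunk) {
    wss.clients.forEach(function (client) {
        if (client.readyState === 1) {              // 1 = OPEN
            client.send(chunk, { binary: true });
        }
    });
});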

 

#62 – Green Energy Dude – Friday, May 30th 2014, 00:19

I see that this App is putting the HTML5 browser support to the test , good job keep the faith....

#63 – Santosh Singh – Wednesday, June 4th 2014, 12:37

This is excellent.
I managed to install the packages and stream the Raspberry Pi Camera through my webpage. The only challenge is that the video is slow and lags (almost by 10 to 30 secs).
I am using avconv instead of ffmpeg as the original ffmpeg command stated to use avconv instead.

Is there a difference in using avconv v/s building ffmpeg from scratch?

#64 – malin – Friday, June 6th 2014, 12:13

Thanks a lot for this.

But how can I send an MPEG1 file from a C++ program using OpenCV, to replace the ffmpeg tool?

Plx

#65 – Tracy – Sunday, June 22nd 2014, 00:51

I've got everything up until the point where I change the default password at the top of the file. I'm probably blind, but I just don't see that line.

#66 – bolosse – Monday, June 23rd 2014, 20:17

It's when you run your app:

node stream-server.js yourpassword

#67 – assman – Thursday, July 3rd 2014, 16:06

Live stream my ass ! lol .. a loop video .. that what is this..

#68 – roboticfan – Thursday, July 10th 2014, 12:23

Hi mike (#60 comment)

I had the same problem as you, here is the answer to your problem. 

Write this down on your terminal first : 
npm config set registry http://registry.npmjs.org/

then you can install anything using npm install. 

Although I appreciate the work, I find it pretty impolite of the author of the article to draft the installation process so quickly: nodejs is not THAT simple to install!

#69 – roboticfan – Thursday, July 10th 2014, 12:26

Me again, just a correction, the correct link is :

npm config set registry http://registry.npmjs.org/

Hope I didn't answer too late (I could see the comment is from May)

#70 – roboti – Thursday, July 10th 2014, 12:28

Geez, last spam, sorry: the website automatically converts the registry URL, but you have to type "http://" (without the quotes) in front of registry.npmjs.org. The full answer can be found here:
stackoverflow.com/questions/12913141/installing-from-npm-fails/13119867#13119867

#71 – Mo-Che Chan – Monday, July 21st 2014, 06:51

I have tried this jsmpg successfully. However, so far I have failed to extend this example code to multiple streams at once.

#72 – Bruteche – Wednesday, August 6th 2014, 18:23

Genius !! Thanks guys. Will be awesome for my FPV.

#73 – Steve – Friday, August 29th 2014, 20:51

I'm using ffmpeg and node.js on windows 7 and managed to install and make everything work, except for this command

ffmpeg -s 640x480 -f video4linux2 -i /dev/video0 -f mpeg1video \
-b 800k -r 30 example.com:8082/yourpassword/640/480/

Does any one know how to convert this command to windows and its drivers please?

#74 – Seif – Friday, September 26th 2014, 14:48

Can i use this nodejs server to stream multiple video feeds?

#75 – Vance – Thursday, October 16th 2014, 18:14

Hi, I was able to get your setup to work with a Macbook using the avfoundation on ffmpeg. When using the following, the web browser gets a choppy picture with lines and double images. Is there something wrong that I'm doing?

// Command to run the ffmpeg
ffmpeg -f avfoundation -i "0" -f mpeg1video -b:v 800k -r 30 127.0.0.1:8082/test

Any tips would be great. Thanks.

#76 – song – Friday, October 17th 2014, 09:27

Hi! Thank you for your article! Awesome!
I need to stream audio too... Is there any good way to do it?

#77 – Alex – Saturday, November 1st 2014, 00:44

Hi, I am trying this now and I get a connection refused when I try to connect to ws://localhost:3284/

#78 – jim – Tuesday, November 11th 2014, 06:48

how can I run this with a secure socket connection? (wss:// not ws://)

#79 – mark – Saturday, November 15th 2014, 06:07

Hi, I implemented your idea with ffmpeg and nodejs, but the video is not smooth. It can't play in real time. Am I missing something?

#80 – Relfor – Friday, December 5th 2014, 04:56

Thanks a lot for jsmpeg! This non-flash solution is unique!

On my raspberry pi (Model: B+; OS: Raspbian) I used the following command:

avconv -f video4linux2 -y -s 320x240 -r 30 -i /dev/video0 -f mpeg1video -q:v 6 -vf "vflip" -an -b:v 0 -b:a 0 http://<ip>:<port>/<secret>/320/240



Note that you might not have /dev/video and video4linux2 available by default. At least I didn't have them pre-installed. To install them, follow the link Ithorion (one of the commenters on this blog post) suggests:
www.linux-projects.org/modules/sections/index.php?op=viewarticle&artid=14

 

#81 – Mariano – Tuesday, December 9th 2014, 01:49

Hi. This is awesome. But I'm having problems when I want to show the canvas at 1024x720. If I encode the stream at that resolution, my canvas still shows at 640x480. If I change the canvas CSS rule to width: 1024px; height: 720px; it gets pixelated.
Is there a way to have a canvas with the actual size of the stream it receives from the server, in this case 1024x720?
Cheers!
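
(A hedged note for this case: one common cause of such pixelation is scaling the canvas with CSS only, while its backing store stays at a smaller size. Giving the canvas element itself the stream's pixel dimensions, roughly as below, keeps the full 1024x720 resolution. This is an illustrative sketch, not part of the original example, and untested against this exact setup.)

// Illustrative sketch: match the canvas backing store to the stream resolution instead of
// only stretching it with CSS. Not part of the original example.
var canvas = document.getElementById('videoCanvas');
canvas.width = 1024;                  // decode/backing-store resolution
canvas.height = 720;
canvas.style.width = '1024px';        // displayed size
canvas.style.height = '720px';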

#82 – Aphex – Wednesday, December 10th 2014, 00:41

I get an error message that ffmpeg is deprecated. How do I fix this?

#83 – Ramesh – Friday, December 19th 2014, 11:08

Hi, this is Ramesh.
I am using this tutorial to stream and record videos using live cams.
Actually, I have an issue: when the camera streams, I indicate it as front camera or reverse camera on top of the HTML stream. In the same way, I need to indicate a message when the camera is disconnected. Can you help me?

#84 – wozbit – Tuesday, March 17th 2015, 23:44

Does anybody know how to send the webcam ffmpeg stream using VLC only? I'm pretty sure similar options will then work on Windows and Mac version.

#85 – Galdo – Thursday, March 19th 2015, 20:29

I followed every step, but when I open the HTML file I don't receive anything. My stream server is working fine.
I use
ffmpeg -s 640x480 -f dshow -i video="Lenovo EasyCamera" -f mpeg1video \ -b 800k -r 30 127.0.0.1:8082/galdog/640/480
and it gives me this, repeated many times:
[dshow @ 00000000044fd2e0] real-time buffer [Lenovo EasyCamera] [video input] too full or near too full (101% of size: 3041280 [rtbufsize parameter])! frame dropped!
Can somebody help me? I use only one local machine and everything runs on it.

#86 – wozbit – Saturday, March 21st 2015, 11:21

try lowering the bitrate to -b 400k or less!

#87 – Galdo – Saturday, March 21st 2015, 15:27

that didn't help me. Could the problem be that I run everything on one PC?

#88 – Galdo – Saturday, March 21st 2015, 15:41

sorry about my spamming, but I found what crashed my system... it was the \ after mpeg1video! When I removed it, everything started working ;)

#89 – wozbit – Saturday, March 21st 2015, 22:11

Managed to get smooth video from webcam on a mac using qtkit instead of avfoundation. Im using ffmpeg from macports.

first find webcam device:
ffmpeg -f qtkit -list_devices true -i ""

change settings accordingly!

ffmpeg -f qtkit -i "0" -f mpeg1video -b:v 400k -r 30 localhost:8082/password/640/480/

All platforms supported now :D

#90 – master – Tuesday, April 14th 2015, 00:51

UV4L supports WebRTC Audio/Video streaming in Real-time (~ 200 ms) from the Rpi to any browser: www.linux-projects.org . No special configuration is required.

#91 – AudioMan – Friday, May 15th 2015, 09:03

For audio, have a look at stackoverflow.com/questions/3955103/streaming-audio-from-a-node-js-server-to-html5-audio-tag

The tag should be: "<audio autoplay controls src="localhost:8000/"></audio>"

#92 – 131 – Saturday, May 16th 2015, 23:58

I spent a whole weekend on Media Source Extensions, MediaRecorder, WebM, MP4, RecordRTC & stuff.

Then I tried your solution.
It's working perfectly fine for my needs.

#93 – Cobraeti – Monday, June 8th 2015, 22:05

Hello,
That is interesting for a project of mine with a camera on a robot, but is it supposed to still work with Node.js 0.12? Because I get a gray screen in the browser when testing the base code you are talking about in this article.

#94 – Cobraeti – Monday, June 8th 2015, 22:13

Sorry, I forgot to say that FFmpeg works fine when streaming to a file, and that the server recognises (or seems to recognise) the incoming stream and the client connection (corresponding logs in the console).

#95 – Andrew Simpson – Thursday, July 2nd 2015, 21:16

Hi,

I have a Windows 7 professional dev PC. I am going through the steps to set this up.

I have installed node.js and stream-server.js file.

When I type this in:

npm install ws


I get errors. I am hoping that if you look at these errors you will know instantly what is wrong?

I have posted a screenshot at the supplied URI. Please take a look when you can?

Thanks
 

#96 – Francois Baret – Tuesday, July 14th 2015, 19:05

In this post (stackoverflow.com/questions/30736301/webcam-displayed-on-lan-not-to-the-internet) I explain that I get everything fine on the local network but not from the outside. Can you advise?
Thanks!

#97 – Francois Baret – Tuesday, July 14th 2015, 19:15

Sorry, but I already got your answer on your github forum!

#98 – Grauen – Tuesday, September 1st 2015, 19:57

Is it also possible with this to stream a client's webcam (video + audio) to a webserver using stream.js and make the stream available to other clients?

#99 – laurent – Saturday, October 3rd 2015, 22:00

Hello

Nice work ! 

I have tested on Windows and Mac OS today; it is working very well, except that around frame 3000 ffmpeg breaks (broken pipe).
I think something happens on the node.js server that breaks the ffmpeg stream after some time?

Ffmpeg Error in : av_interleaved_write_frame():

Any idea? I would like to put your server on an embedded system that will run forever.

Laurent

#100 – Niko Hallikainen – Sunday, October 25th 2015, 19:05

Can it stream audio too?

Regards, Niko H.

#101 – CoolMcCool – Thursday, October 29th 2015, 17:41

@Laurent: You should clear or increase the timeout of the connection via clearTimeout or via setTimeout (the default timeout is set to 120000ms).

#102 – Laurent – Saturday, November 7th 2015, 11:31

Ok, thank you, I will try. The strange thing is that under Linux
the timeout does not occur?

Best Regards,

Laurent

#103 – Andrea – Thursday, December 3rd 2015, 13:45

I got an "Upgrade Required" message when I try to connect to localhost:8084 with chrome. What I'm doing wrong?

thank you.
Andrea

#104 – Daniel – Tuesday, December 22nd 2015, 20:06

@Laurent - I have the same issue, however it seems that it does not work in Linux in my case (Raspbian wheezy, Ubuntu 14.04). The broken pipe is not tied to the frame number so much as it is to the time - I consistently get the av_interleaved_write_frame() and broken pipe errors at almost exactly 2 minutes of successful streaming. I have varied the frame rate (24 & 30 fps) to verify this. Does anyone else have a solution to this problem?

@Andrea, Chrome is finicky about locally hosted sites - Firefox should be able to run it, but if you want to try it on Chrome, you might need to host it on a remote server or site.

#105 – laurent – Thursday, December 24th 2015, 21:29

Still the same: timeout on Windows and Mac at frame 3002, but not on Linux?

I tried to play with the websocket timeout but I think I am not doing it right?

Thank you

#106 – laurent – Friday, December 25th 2015, 00:10

Just found that it was the nodejs version: working when the version is 0.x.x and timing out when the version is 4.x.x.

#107 – Ghost – Saturday, January 9th 2016, 15:53

Hi, I followed the same steps, but when I run the command:
<ffmpeg -s 640x480 -f video4linux2 -i /dev/video0 -f mpeg1video \
-b 800k -r 30 example.com:8082/yourpassword/640/480/>

i get this error :
<ffmpeg version N-77776-g0948e0f Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 4.8 (Raspbian 4.8.2-21~rpi3rpi1)
configuration: --arch=armel --target-os=linux --enable-gpl --enable-libx264 --enable-nonfree
libavutil 55. 13.100 / 55. 13.100
libavcodec 57. 22.100 / 57. 22.100
libavformat 57. 21.101 / 57. 21.101
libavdevice 57. 0.100 / 57. 0.100
libavfilter 6. 23.100 / 6. 23.100
libswscale 4. 0.100 / 4. 0.100
libswresample 2. 0.101 / 2. 0.101
libpostproc 54. 0.100 / 54. 0.100
[video4linux2,v4l2 @ 0x3232230] The device does not support the streaming I/O method.
/dev/video0: Function not implemented
>
help!

#108 – akash – Thursday, February 11th 2016, 14:25

@ghost: check using

ls /dev/video*

to detect input video sources.

@laurent: I am also facing the same issue of av_interleaved_write_frame: broken pipe when trying to run on Windows and the Raspberry Pi. Did you find any solution?

 

#109 – calvin – Saturday, February 13th 2016, 20:27

@akash, @ laurent
The nodejs HTTP connection default timeout is 2 minutes:
nodejs.org/api/http.html#http_server_timeout
Set it to 0 to disable the timeout.

#110 – Miguel – Monday, March 7th 2016, 13:28

When I access the stream in a browser I get the message "upgrade required". What should I do? Also, what's the best ffmpeg command to use on Windows?

using this: ffmpeg -s 640x480 -f dshow -i video="USB2.0 HD UVC WebCam" -f mpeg1video \
-b 800k -r 30 172.0.0.1:8082/111/640/480/

and it says the buffer is too full

#111 – Magnus – Wednesday, March 23rd 2016, 02:04

I found the solution to the 120 seconds timeout issue. The timeout issue seems to be tied to Node.js v4.x.x and newer. In order to amend the timeout problem, I added this vicious and elusive line of code to "stream-server.js":

response.connection.setTimeout(0);

The file, "stream-server.js" goes like this after the correction:
 

if (process.argv.length < 3) {
    console.log(
        'Usage: \n' +
        'node stream-server.js <secret> [<stream-port> <websocket-port>]'
    );
    process.exit();
}

var STREAM_SECRET = process.argv[2],
    STREAM_PORT = process.argv[3] || 8082,
    WEBSOCKET_PORT = process.argv[4] || 8084,
    STREAM_MAGIC_BYTES = 'jsmp'; // Must be 4 bytes

var width = 320,
    height = 240;

// Websocket Server
var socketServer = new(require('ws').Server)({
    port: WEBSOCKET_PORT
});
socketServer.on('connection', function(socket) {
    // Send magic bytes and video size to the newly connected socket
    // struct { char magic[4]; unsigned short width, height;}
    var streamHeader = new Buffer(8);
    streamHeader.write(STREAM_MAGIC_BYTES);
    streamHeader.writeUInt16BE(width, 4);
    streamHeader.writeUInt16BE(height, 6);
    socket.send(streamHeader, {
        binary: true
    });

    console.log('New WebSocket Connection (' + socketServer.clients.length + ' total)');

    socket.on('close', function(code, message) {
        console.log('Disconnected WebSocket (' + socketServer.clients.length + ' total)');
    });
});

socketServer.broadcast = function(data, opts) {
    for (var i in this.clients) {
        if (this.clients[i].readyState == 1) {
            this.clients[i].send(data, opts);
        } else {
            console.log('Error: Client (' + i + ') not connected.');
        }
    }
};


// HTTP Server to accept incomming MPEG Stream
var streamServer = require('http').createServer(function(request, response) {
    var params = request.url.substr(1).split('/');
    response.connection.setTimeout(0);
    if (params[0] == STREAM_SECRET) {
        width = (params[1] || 320) | 0;
        height = (params[2] || 240) | 0;

        console.log(
            'Stream Connected: ' + request.socket.remoteAddress +
            ':' + request.socket.remotePort + ' size: ' + width + 'x' + height
        );
        request.on('data', function(data) {
            socketServer.broadcast(data, {
                binary: true
            });
        });
    } else {
        console.log(
            'Failed Stream Connection: ' + request.socket.remoteAddress +
            request.socket.remotePort + ' - wrong secret.'
        );
        response.end();
    }
}).listen(STREAM_PORT);

console.log('Listening for MPEG Stream on http://127.0.0.1:' + STREAM_PORT + '/<secret>/<width>/<height>');
console.log('Awaiting WebSocket connections on ws://127.0.0.1:' + WEBSOCKET_PORT + '/');

 

#112 – IMBM – Thursday, June 2nd 2016, 16:48

Thank you Magnus. This is really the solution I am looking for.

#113 – Charles E – Monday, September 26th 2016, 07:10

I'm having an issue with your decoder; here is the section in question. The "quantMatrix" was null. I will say that I'm using gdigrab and I'm on a Windows machine. Any help would be greatly appreciated.
 

// Dequantize, oddify, clip
		level <<= 1;
		if( !this.macroblockIntra ) {
			level += (level < 0 ? -1 : 1);
		}
		level = (level * this.quantizerScale * quantMatrix[dezigZagged]) >> 4;
		if( (level & 1) === 0 ) {
			level -= level > 0 ? 1 : -1;
		}
		if( level > 2047 ) {
			level = 2047;
		}
		else if( level < -2048 ) {
			level = -2048;
		}

 

#114 – Bozon33 – Tuesday, October 4th 2016, 21:23

Your solution is really a step back, not a step forward.
HTML5 Media Source Extensions are the future.
You don't have to use MPEG-DASH to stream to HTML5 Media Source Extensions; you can happily use websockets for that and send small enough segments to achieve low latency. As of Oct 1 2016 there are two streaming servers that do that: EvoStream server and Unreal Media Server.
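
(For context, the MSE-over-WebSocket route described here looks roughly like this on the client. This is a sketch only: it assumes the server pushes fragmented MP4 segments, initialization segment first, matching the codec string below, which is a different pipeline from the MPEG1/jsmpeg approach in this post.)

// Rough sketch of Media Source Extensions fed over a WebSocket, as the commenter describes.
// Assumes fragmented MP4 from the server (init segment first) matching the codec string.
var video = document.getElementById('video');
var mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function () {
    var sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
    var queue = [];

    sourceBuffer.addEventListener('updateend', function () {
        if (queue.length && !sourceBuffer.updating) {
            sourceBuffer.appendBuffer(queue.shift());
        }
    });

    var ws = new WebSocket('ws://example.com:8084/');
    ws.binaryType = 'arraybuffer';
    ws.onmessage = function (event) {
        if (sourceBuffer.updating || queue.length) {
            queue.push(event.data);                 // appendBuffer can't run while updating
        } else {
            sourceBuffer.appendBuffer(event.data);
        }
    };
});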

#115 – Getting the Frame – Monday, October 10th 2016, 16:57

Is it possible to get a frame from the stream? If yes, how could I do it, supposing that I'm on the client side, not the server side? Thank you :)
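
(A hedged answer sketch: since jsmpeg draws every decoded frame onto the canvas, the current frame can be grabbed client-side straight from the canvas element with the standard canvas API; this is not something jsmpeg itself provides.)

// Grab the most recently rendered frame from the canvas the player draws into.
// Standard canvas API; assumes the canvas id used in the earlier sketches.
var canvas = document.getElementById('videoCanvas');
var frameDataUrl = canvas.toDataURL('image/png');   // base64-encoded PNG of the current frame

// e.g. show it in an <img>, or upload it somewhere
var img = document.createElement('img');
img.src = frameDataUrl;
document.body.appendChild(img);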

#116 – what's wrong???? – Thursday, October 27th 2016, 09:33

root@hf:~/node_ws# node stream-server.js 123456 9000 8010
Listening for MPEG Stream on 127.0.0.1:9000/<secret>/<width>/<height>
Awaiting WebSocket connections on ws://127.0.0.1:8010/
Stream Connected: 172.18.12.207:54195 size: 320x240
New WebSocket Connection (1 total)

/root/node_ws/node_modules/ws/lib/PerMessageDeflate.js:309
var data = Buffer.concat(buffers);
^
TypeError: Object function Buffer(subject, encoding, offset) {
if (!(this instanceof Buffer)) {
return new Buffer(subject, encoding, offset);
}

var type;

// Are we slicing?
if (typeof offset === 'number') {
this.length = coerce(encoding);
this.parent = subject;
this.offset = offset;
} else {
// Find the length
switch (type = typeof subject) {
case 'number':
this.length = coerce(subject);
break;

case 'string':
this.length = Buffer.byteLength(subject, encoding);
break;

case 'object': // Assume object is an array
this.length = coerce(subject.length);
break;

default:
throw new Error('First argument needs to be a number, ' +
'array or string.');
}

if (this.length > Buffer.poolSize) {
// Big buffer, just alloc one.
this.parent = new SlowBuffer(this.length);
this.offset = 0;

} else {
// Small buffer.
if (!pool || pool.length - pool.used < this.length) allocPool();
this.parent = pool;
this.offset = pool.used;
pool.used += this.length;
}

// Treat array-ish objects as a byte array.
if (isArrayIsh(subject)) {
for (var i = 0; i < this.length; i++) {
this.parent[i + this.offset] = subject[i];
}
} else if (type == 'string') {
// We are a string
this.length = this.write(subject, 0, encoding);
}
}

SlowBuffer.makeFastBuffer(this.parent, this, this.offset, this.length);
} has no method 'concat'
at /root/node_ws/node_modules/ws/lib/PerMessageDeflate.js:309:23
at DeflateRaw.callback (zlib.js:404:13)
root@hf:~/node_ws# ls
node_modules stream-server.js

#117 – eridem – Saturday, November 5th 2016, 12:41

For Windows, you can use DirectShow to connect your webcam:

A. Find list of devices:
 

c:\> ffmpeg.exe -list_devices true -f dshow -i dummy

ffmpeg version N-45279-g6b86dd5... --enable-runtime-cpudetect
  libavutil      51. 74.100 / 51. 74.100
  libavcodec     54. 65.100 / 54. 65.100
  libavformat    54. 31.100 / 54. 31.100
  libavdevice    54.  3.100 / 54.  3.100
  libavfilter     3. 19.102 /  3. 19.102
  libswscale      2.  1.101 /  2.  1.101
  libswresample   0. 16.100 /  0. 16.100
[dshow @ 03ACF580] DirectShow video devices
[dshow @ 03ACF580]  "Integrated Camera"
[dshow @ 03ACF580]  "screen-capture-recorder"
[dshow @ 03ACF580] DirectShow audio devices
[dshow @ 03ACF580]  "Internal Microphone (Conexant 2"
[dshow @ 03ACF580]  "virtual-audio-capturer"
dummy: Immediate exit requested



B. Then, pick one of the inputs. Like in your example:
 

c:\> ffmpeg.exe -f dshow -i video="Integrated Camera" -f mpeg1video -b 800k -r 30 http://127.0.0.1:8082/123456/640/640



Cheers!

Refs: trac.ffmpeg.org/wiki/DirectShow

 

#118 – Dave – Tuesday, November 8th 2016, 08:05

I would like to get this running with wss://

my website is running https, and I'm getting errors trying to load an insecure websocket

#119 – Young Treant – Wednesday, November 9th 2016, 20:50

Added audio; unfortunately I can't share the code (it's not mine), though I can give instructions:
change the ffmpeg format
filter the audio and video from the muxed data
separate the data into frames
send the frames to the library at the right times (this will play the video)
play the audio data

The ffmpeg command should of course be changed from mpeg1video to mpeg.
(The reason the library doesn't work when you simply change it to mpeg is that with this format messages aren't sent frame by frame, as with live mpeg1video, but every couple of hundred milliseconds; the code counts on a frame or less per message.)
Then you need to cut the frames out of the mpeg (andrewduncan.net/mpeg/mpeg-1.html might help: wait for a packet with a stream id, take the next <length size> bytes and add them to a "video buffer" [same for audio]).
Then you need to cut the next frame from this "video buffer" and send it every 1000/frameRate ms (window.setInterval) to the library, as if it had been sent from the nodejs server (a rough sketch of this timed feed follows after this comment).
To add the audio you'll have to add mp3 or pcm to the ffmpeg line, and cut it likewise into a clean "audio buffer"; then you can play it (you can use Web Audio, which also works on mobiles).
To play low-latency mp3 without audio problems, you can use this low-latency audio library: github.com/JoJoBond/3LAS
To sync you can use the PTS in the messages you filtered (or just time it by a constant delay you find, hoping it won't go out of sync).
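
(The timed feeding step in those instructions might look roughly like this. It is a sketch only: videoFrames and playFrame are hypothetical placeholders for the demuxed frame buffer and for whatever entry point the modified decoder exposes; neither exists in jsmpeg as published.)

// Hypothetical sketch of the "send the frames timely to the library" step described above.
var frameRate = 30;
var videoFrames = [];                 // frames cut out of the demuxed "video buffer" elsewhere

function playFrame(frameBytes) {
    // hypothetical hook: hand one frame to the (modified) decoder,
    // as if it had just arrived over the WebSocket
}

window.setInterval(function () {
    if (videoFrames.length > 0) {
        playFrame(videoFrames.shift());
    }
}, 1000 / frameRate);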

#120 – Chris Roth – Monday, November 28th 2016, 01:49

Hey Dominic,

Thanks so much for the post. This was extremely helpful for a project that I've been working on where I'm streaming web pages rendered with PhantomJS into the browser: github.com/cjroth/aframe-phantomjs-continuous-streaming

Chris

#121 – hkalexling – Tuesday, December 27th 2016, 18:33

Thanks for the great post. 

In case anyone needed, the following command works for me on OSX webcam:
 

ffmpeg -s 640x480 -framerate 30 -f avfoundation -i "0" -f mpeg1video -b 800k http://example.com:8082/secret/640/480

 

#122 – DRB – Friday, January 6th 2017, 08:16

I wanted to make a project where clients will stream live from their webcams. This solution works! So thanks for that! 

But clients have to have ffmpeg installed on their system and have to run the ffmpeg commands directly from the command prompt. How can I change that?

My requirement is that the clients will just stream the videos from their browsers. How can I do that? Any suggestions will be helpful.

#123 – Lena – Monday, February 6th 2017, 12:37

lol

#124 – Lena – Monday, February 6th 2017, 12:37

Very well

#125 – quest – Friday, February 24th 2017, 10:49

The iOS app doesn't work: the link provided by the app does not show the webpage, and the browser says it took too long to respond. Please update the app.

#126 – Jovi Yu – Wednesday, May 3rd 2017, 08:55

The iOS app does not work on my iPhone 5s. I started the app and typed in the IP address of my phone, but the browser could not get anything. Link failed. Please update the app.

#127 – Dominic – Friday, May 5th 2017, 20:21

@Jovi Yu the port was probably blocked by another app. Try restarting your phone. Will fix this in the next update!

#128 – ibraheem – Wednesday, August 9th 2017, 13:09

Hi, can anyone tell me how to configure audio with video using ffmpeg?

 
