I'm hearing a lot about "Flash HTTP Streaming" -- can someone shed some light on whether JW Player supports this, and why it is so important? Is this basically the same as Apple's HTTP Live Streaming (HLS) technology, except for Flash players?
I'm not referring to "pseudo-streaming", but rather this:
http://www.wowzamedia.com/forums/content.php?111
It is not pseudo-streaming, but rather something similar to Apple's HLS (HTTP Live Streaming) protocol and Microsoft's Smooth Streaming.
The link you sent in your post actually seems to make this distinction with this statement: "HTTP Pseudostreaming should not be confused with HTTP Dynamic Streaming. The latter is a brand-new mechanism solely supported by the Flash Plugin 10.1+ that works by chopping up the original video in so-called chunks of a few seconds each. The videoplayer seamlessly glues these chunks together again. The JW Player does not yet support HTTP Dynamic Streaming."
So, does that mean the JW Player still does not support this new HTTP streaming as defined by the first link I pasted?
To answer your question, the JW Player only supports HTTP pseudo streaming. HTTP Live Streaming, HTTP Dynamic Streaming and HTTP Smooth Streaming (which, as you point out, are all variations on the same theme) are not yet supported by the JW Player.
We don’t have any announcements yet on HTTP live/adaptive streaming, although we are keeping an eye on how the market for HTTP streaming is shaping up.
@PabloS it would be really good to implement the Flash HTTP Streaming (San Jose Streaming) provider. IMHO this is the future. We are developing video technology for mobile devices (live streaming, VOD), and segmented streaming like that on Apple devices is exactly what Flash is lacking:
- You have no restrictions (in general) on the packets, since it's plain HTTP. RTMP on port 1935 is usually blocked in managed networks, and RTMPT isn't always reliable either and significantly increases server load (we are using the Wowza streaming server). Plain HTTP should be available in 99.999% of cases.
- You load just the chunks, which means no continuous load of streamed data: you fetch data once it's needed, at the full download speed available, and then just play it with no need to keep downloading. Perfect for both mobile devices and servers.
- I think this is the right direction for the #1 player, which JW Player surely is (for now). Customers want and need it.
- The problem is that everyone is just waiting to see how the market develops. It's good to be first, like Apple... ;-)
So please, don't think Flash HTTP streaming is a dead end.
Thank you for reconsidering implementing this media provider.
We have beta support for HTTP streaming. However, this is Apple HTTP Live Streaming, not Adobe Zeri. In the end, this should make things easier for people, since only one output format is needed instead of two.
Here you can find the provider and documentation. Please let us know any feedback!
I have been playing with your adaptive branch for about a month now for a personal project. (and have just grabbed the latest copy from the above link).
I would just like to say: about time *someone* implemented (Apple's) HTTP Live Streaming (in a way that actually works for viewing the streams) on a platform other than one maintained by Apple.
Not to mention the fact that JW Player can do all of this alongside the video-tag fallback, which is fantastic.
Unfortunately, for my personal project the majority of the video streams are AVC with MP3 audio instead of AVC with AAC audio. Is the AAC audio limitation somehow a limitation of the Flash player? Or is it more that MP3 audio streams are less common, and thus just not implemented?
I have downloaded your code for the adaptive provider and am going to see if I can naively modify it to support MP3, but if this is impossible for some technical reason, I will keep checking back here every now and then to see if any more comments have been posted.
In the mean time, thanks for your great work! Peter Avram
Excited to hear that there is some work being done on supporting Apple's HLS in JW Player. My ideal world as a content producer is to only need to encode to a single format. Since I want to support iPads, that format is HLS.
Please let me know if you want beta testers. I can help QA. -Eric
@Peter: MP3 can be supported, since it is a codec that is supported in Flash. Perhaps you can send us a small test stream (e.g. 2 quality levels, 1 minute)? Then I can look into it. No promises around timing, since jumping into the demuxing code again does require me to have a full spare day or two.
@Jeroen Great to hear! I have made a stream and sent it on to you.
I have waited several months for even basic support of this feature; I can wait, knowing that *someone* has it on the agenda!
Adobe are pushing their own solution (the proprietary Flash Media Server), Microsoft are pushing Silverlight Smooth Streaming, Google want WebM to take off, and unfortunately Apple's spec requires AVC (H.264). Apple strangely don't want their iPhone format playable on the PC. They have QuickTime X, which would give the PC the capability (even if through a plugin for web-based video), but for some reason they haven't ported it to Win32 yet.
In that list Apple really are the strangest: they are trying to push this technology, and they (presumably) want it to take off (they like H.264, and their phone market). Win32 support for the streaming format would surely only help them. Also, with all their complaining about Flash, you would hope they would implement the format in at least their Win32 port of Safari. Sadly, they aren't thinking coherently enough, and are missing an opportunity to give Safari on Windows more relevance.
Adaptive streaming will probably remain a fragmented market for some time. We started doing some work on Apple HLS, since that's the format that offers access to a unique userbase (iOS). Both Smooth and Zeri provide access to the same desktop userbase, for which, frankly, dynamic RTMP works fine today.
I received the MP3 streams and will do some tests with them!
Sorry for the newbie questions, but I can't seem to get the JW Player to start playing an HLS playlist. I have a local Tomcat server running, and the embed code looks like: bc..
<div id="playerdiv">Sorry, not FLASH or HTML5 compatible</div>
<script type="text/javascript">
jwplayer("playerdiv").setup({
  height: 360,
  width: 960,
  autostart: 'true',
  modes: [
    {
      type: 'flash',
      src: '/VLCTest/jwplayer/player.swf',
      provider: '/VLCTest/jwplayer/adaptiveProvider.swf',
      file: '/VLCTest/httpstreaming/playlist.m3u8'
    },
    {
      type: 'html5',
      file: '/VLCTest/httpstreaming/playlist.m3u8'
    }
  ]
});
</script>
I see the player get initialized, and I can even get it to use different skins, but it never starts playing. The timeline bar always shows 00:00 as the end time. If I point an app like VLC to the same HTTP location, it does see the playlist file.
Do you have any suggestions on how to debug this? There were no errors in Firebug.
OK, so I modified my script to be closer to the test code (and further from what is listed in the docs), and it now tries to load the file. The seek bar appears to get the length of the videos in the playlist, but I get the following error: [IOErrorEvent type="ioError" bubbles=false cancelable=false eventPhase=2 text="Error #2032"]
The following is my current script bc.. <!-- START OF THE PLAYER EMBEDDING -->
<div id="playerdiv">Sorry, not FLASH compatible</div>
I have tried the adaptiveProvider you created, but no luck yet either. I see the .m3u8 get loaded from the webserver, as well as the first media fragment, but after that I get the same error as ccrotty:
I am *very* interested in getting HLS to work inside a flash container, please let me know if there is any more info I can provide to help diagnose this.
Ok, problem solved already. It seems that the URLs in the m3u8 file are always taken as relative paths, even if the URL starts with http://.
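For reference, the resolution rule that would avoid this looks something like the following. This is a hypothetical helper, not the player's actual code: absolute URIs should pass through untouched, and relative ones should resolve against the playlist's own URL, not the player's location.

```javascript
// Sketch: resolve a segment URI from an M3U8 against the playlist URL.
// Absolute http(s) URIs are returned as-is; relative ones are joined
// onto the playlist's directory. Function name is an assumption.
function resolveSegmentUrl(playlistUrl, uri) {
  if (/^https?:\/\//i.test(uri)) {
    return uri; // already absolute: do not treat as relative
  }
  // Strip the playlist filename, keep its directory, and append the URI.
  return playlistUrl.replace(/[^\/]*$/, "") + uri;
}
```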
Things seem OK now: the progress bar shows the right total time, and the proper media files are loaded and played when seeking. The only problem I still have is that a few seconds into each fragment (3 to 6 seconds, it seems), playing stops and the last frame is displayed. I guess this might be caused by the way I encoded the video; I'll investigate this a bit more.
This is AWESOME! We've been waiting for HLS to be supported on JWPlayer since it was released. Now that Android is supporting HLS on Honeycomb this will quickly become a standard. Any target release dates? Thanks again for the hard effort in supporting this important protocol.
HLS support seems to be pretty much functional, great stuff!
There is one more issue I ran into to get things working, though: my video fragments are played for only a few seconds (typically between 3 and 6), after which video playback stalls, sometimes with stuttering audio, or the message 'No AAC audio or AVC video stream found'.
Some searching only directs me to the JW Player source; this message seems to originate in the MPEG transport stream demuxer: bc..
if (_videoPES.length == 0 || _audioPES.length == 0)
    throw new Error("No AAC audio or AVC video stream found.");
Should I be looking for errors in my media files, or could this still be an issue in JW Player?
@Ico: can you put a small example live, so I can take a look? For example a one-minute clip? You can email the example; jeroen [at] longtailvideo.
@ccrotty: do you have the same issue, with absolute URLs being treated as relative ones?
@Dan: no fixed release date yet, but we’re working at making the provider as stable as possible at present. If you use a service like Wowza or have segmented your files with Apple’s filesegmenter, the provider should be production-ready.
The provider indeed supports live streaming. The M3U8 file needs to hint this though, by not having the EXT-X-ENDLIST line in there. If the provider does not see that line, it jumps to “live” mode, doing two things:
1. Re-fetching the manifest
2. Not displaying the timescrubber (reflecting that it’s a live event)
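The live-detection rule described above can be sketched as follows. This is an illustrative function (the name is an assumption), mirroring the behavior of checking the manifest for EXT-X-ENDLIST:

```javascript
// Sketch: a playlist WITHOUT an EXT-X-ENDLIST tag is treated as live,
// which triggers manifest re-fetching and hides the time scrubber.
function isLivePlaylist(m3u8Text) {
  return !m3u8Text
    .split("\n")
    .some((line) => line.trim() === "#EXT-X-ENDLIST");
}
```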
The provider should be capable of playing video-only streams, although that’s not tested. When this is a TS stream with only an H264 channel, things should work.
I see the M3U8.isLive() function that checks for EXT-X-ENDLIST, but when running, the player does not appear to re-read the playlist.
I have the following playlist defined: bc..
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10, no desc
fileSequence0.ts
#EXTINF:10, no desc
fileSequence1.ts
#EXTINF:10, no desc
fileSequence2.ts
I can start JW Player playing, but if I go back and add new chunks, the player still stops after playing fileSequence2.ts.
BTW, the timeline was still showing, although it did not have an end time listed.
The stream is OK, do you change the playlist fast enough? Also, you should remove old entries when adding new ones. HLS live is using the principle of a “sliding window”.
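The sliding-window principle mentioned above can be sketched like this: each server-side update appends the newest segment, drops the oldest, and bumps EXT-X-MEDIA-SEQUENCE so clients know which segments have left the window. The function and field names here are illustrative assumptions, not Wowza or JW Player code:

```javascript
// Sketch of a live HLS sliding window: the playlist always holds the
// most recent N segments, and the media sequence number advances as
// old segments are removed from the front.
function slideWindow(playlist, newSegment) {
  return {
    // The sequence number identifies the FIRST segment in the list;
    // dropping one from the front means it increases by one.
    mediaSequence: playlist.mediaSequence + 1,
    segments: playlist.segments.slice(1).concat([newSegment]),
  };
}
```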
Note the “trailers” example on this test page is a live stream:
One more question; I'd rather ask it here so the answer becomes public: we still have the issue that JW Player assumes all URLs in the playlist are relative to the player itself, so it is not possible to use absolute URLs pointing to another server. Is this behaviour as designed?
FYI, Once I added the #EXT-X-ALLOW-CACHE:NO line to the playlist file, then it seemed to reread the playlist as expected.
One more question, just to verify: the provider does not support the ANSI version of the playlist file (.m3u), just the UTF-8 encoded version (.m3u8)?
I tried playing an ANSI encoded file and got the following error: "The video could not be loaded, either because the server or network failed or because the format is not supported:"
We have some transcoded video that does not contain any audio. When we try to play it in the player we get the following error: "No AAC audio or AVC video stream found."
Is an audio stream required, or is there a config to say ignore it?
This playlist seems to play OK in the VLC player (with its stuttering and stopping between chunks, of course).
Regarding not playing a video-only stream: I see the issue where the TS.as file requires both video and audio, throwing an error otherwise.
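A minimal sketch of how that check could be relaxed to tolerate video-only (or audio-only) streams, written here in JavaScript rather than ActionScript for illustration; the names stand in for the demuxer's PES buffers and this is not the actual player code:

```javascript
// Hypothetical demuxer check, relaxed for single-track streams.
// videoPES / audioPES stand in for packetized elementary stream buffers.
function checkStreams(videoPES, audioPES) {
  // Original behavior threw if EITHER buffer was empty.
  // Relaxed: throw only if BOTH are empty, so video-only TS files play.
  if (videoPES.length === 0 && audioPES.length === 0) {
    throw new Error("No AAC audio or AVC video stream found.");
  }
  return {
    hasVideo: videoPES.length > 0,
    hasAudio: audioPES.length > 0,
  };
}
```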
I am trying to compile the source and am running into several of the following errors:
bc..
C:\Subversion\jwplayer\adaptive\src\com\longtailvideo\adaptive\streaming\Buffer.as(90): col: 29 Error: Call to a possibly undefined method appendBytes through a reference with static type flash.net:NetStream.
    _stream.appendBytes(_encodeTag(_buffer[_tag]));
I'm assuming I have an old version of some package. I am running Flex SDK version 3.6. Do I need to update this?
I've tested the adaptiveProvider for several days, and although VoD HLS works great, Live HLS is always pixelated. I have confirmed that this is not related to the source feed or the encoder/segmenter, since the same streams play fine on different iOS devices. I have also confirmed it's not related to stretching/resizing, since jwplayer is being created with the same dimensions as the source.
Has anyone experienced the same? Do you have any suggestions on possible actions to eliminate or at least minimize this pixelation?
Our H.264 stream is generated by a hardware codec, which generates new SPSs (followed by an IDR) whenever it finds this suitable. This can happen quite often, up to a few times per second when the image is changing a lot.
However, we try to segment the MPEG stream into pieces of approximately the same length (10 seconds), so we end up with fragments containing more than one SPS/PPS per file.
In the JWplayer 'adaptive' spec I found the following section:
"TS fragments should each start with a PES containing SPS/PPS data and a keyframe slice (NAL unit 5). Fragments starting without SPS/PPS and/or with an interframe (NAL unit 1) will work for continous playback, but will break the player if the fragment is the first fragment after a seek or quality level switch"
This does not, however, say how JW Player should react to fragments containing more than one SPS/PPS occurrence. From the code I understand that only the first NAL is searched for SPS/PPS data, which is then extracted and copied into the avcc tag. Any additional SPSs will still be available as H.264 in-band data, but will not end up in the avcc tag.
What is the expected behaviour of JW Player when multiple SPS/PPSs are present in a segment? Is it to be expected that proper decoding is not possible without all the SPS/PPSs being available in the avcc tag? If so, would the solution be to find all SPSs in the whole file instead of only the first NAL, and copy them all into the avcc header?
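The "scan the whole file" approach suggested above could look roughly like this. This is an illustrative sketch (names and structure are assumptions, not JW Player's code): walk an Annex-B elementary stream, find every NAL unit, and collect all SPS (type 7) and PPS (type 8) units instead of only those at the front:

```javascript
// Sketch: collect EVERY SPS and PPS NAL unit in an Annex-B byte stream.
// NAL units are delimited by 00 00 01 start codes; the NAL type is the
// low 5 bits of the first byte after the start code.
function collectParameterSets(bytes) {
  const sps = [];
  const pps = [];
  const starts = [];
  // Find 3-byte start codes (a 4-byte 00 00 00 01 code just has an
  // extra leading zero, which this scan tolerates for a sketch).
  for (let i = 0; i + 2 < bytes.length; i++) {
    if (bytes[i] === 0 && bytes[i + 1] === 0 && bytes[i + 2] === 1) {
      starts.push(i + 3); // payload begins after the start code
      i += 2;
    }
  }
  starts.push(bytes.length + 3); // sentinel so the last NAL gets an end
  for (let n = 0; n < starts.length - 1; n++) {
    const end = Math.min(starts[n + 1] - 3, bytes.length);
    const nal = bytes.slice(starts[n], end);
    const type = nal[0] & 0x1f;
    if (type === 7) sps.push(nal); // sequence parameter set
    else if (type === 8) pps.push(nal); // picture parameter set
  }
  return { sps, pps };
}
```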
Do you have any news about a possible fix for this provider? A while ago we reported it wasn't working well for live HLS feeds, and we provided some reference examples via private message.
I believe we had these cleared out. Last week, I fixed some issues around relative links and audio-only files.
I might have missed your issue though. I got bug reports from 3 or 4 developers, and might have missed one. If that’s the case, can you re-connect with me over email (jeroen [at] longtailvideo)?
As mentioned by Jeroen, Posted: Wed, 2011-06-08 03:18 "The player reloads the playlist after every fragment play. This generally is every 10 seconds."
In a live streaming scenario, when a playlist is sent with a list of 10-second segments (assume 5 minutes' worth of video) and without EXT-X-ENDLIST in the list, will the player reload the list after every 10-second segment, even though it has a huge list to play?
Why not reload the playlist when the player reaches the end of the list?
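For context, my reading of the Pantos HLS draft is that reload timing is tied to the target duration rather than to list exhaustion (a live playlist is a sliding window, so waiting until the end risks missing segments). Roughly: wait about one segment duration after a changed playlist, and a growing multiple of the target duration (0.5, 1.5, then 3.0) when the playlist comes back unchanged. A sketch of that schedule, with hypothetical names:

```javascript
// Sketch of an HLS playlist reload-delay schedule (per my reading of
// the Pantos draft; approximated by using the target duration for the
// initial delay). unchangedAttempts counts consecutive reloads that
// returned an unchanged playlist (0 means the playlist changed).
function reloadDelay(targetDuration, unchangedAttempts) {
  if (unchangedAttempts === 0) return targetDuration;
  const multiplier =
    unchangedAttempts === 1 ? 0.5 : unchangedAttempts === 2 ? 1.5 : 3.0;
  return targetDuration * multiplier;
}
```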
I am trying to play an adaptive stream from my Wowza media server. The source is a stream encoded and sent by Wirecast, h.264 baseline, AAC audio. I am connecting to the Cupertino packetizer from the Wowza server. In my regular player, JW plays this stream fine in iOS.
In a test client I am trying to use Modes to play this same stream in a Flash player, via adaptiveProvider.swf (I grabbed the latest version, about two or so weeks old). The player loads, however I get a black screen on playback and buzzing audio in Flash, and a black screen/back to start in iOS. Clearly I've configured something incorrectly because if I omit Modes and just use the regular embedded method the same .m3u8 stream plays fine in iOS.
Can anyone point to an error I've made below? It's a private stream, so I've replaced the actual URL in the code below with a placeholder. I have no problem sending the actual code to JW if needed to further test. Thanks!!
Things do all look fine. The only thing I can see is that the player might be confused because you set provider=http for HTML5 (best to omit it, since for HTML5 we only have one provider: videoElement).
If you cannot get it to work, please do send me the actual URL at jeroen [at] longtailvideo.
If you do add timed metadata, can you expose it in the form of a JavaScript callback that is passed in, so that the timed-metadata event calls the JavaScript function with the tag data?
Hello, I've noticed the SPS/PPS parsing is failing to get the width and height from the SPS/PPS of High Profile H.264 video generated using FFmpeg and libx264.
When I create an MPEG-TS with these encoding flags: bc..
-vcodec libx264 -flags +loop+mv4 -cmp 256 \
-partitions +parti4x4+parti8x8+partp4x4+partp8x8+partb8x8 \
-me_method hex -subq 7 -trellis 1 -refs 5 -bf 3 \
-flags2 +bpyramid+wpred+mixed_refs+dct8x8 -coder 1 -me_range 16 \
-g 25 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -qmin 10 \
-qmax 51 -qdiff 4
The video I created is 320x240, but the getDimention function returns 16x80, which is not correct. Because of this, the video can't be stretched using the uniform method.
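One plausible cause (speculation on my part): High Profile SPSs (profile_idc 100) carry extra fields such as chroma_format_idc, bit depths, and optional scaling matrices *before* the width/height fields, so a parser written for Baseline profile reads the wrong bits for the dimensions. All SPS fields are exp-Golomb coded, so the misalignment cascades. A sketch of the ue(v) primitive such a parser relies on (illustrative, not the player's code):

```javascript
// Sketch: bit reader with unsigned exp-Golomb decoding, the primitive
// used to read SPS fields. A Baseline-only parser that skips straight
// past profile/level to pic_width_in_mbs_minus1 will misalign on High
// Profile SPSs, which insert extra ue(v)-coded fields first.
function BitReader(bytes) {
  let pos = 0; // bit position
  return {
    readBit() {
      const bit = (bytes[pos >> 3] >> (7 - (pos & 7))) & 1;
      pos++;
      return bit;
    },
    readUE() {
      // Count leading zeros, then read that many bits after the 1.
      let zeros = 0;
      while (this.readBit() === 0) zeros++;
      let value = 1;
      for (let i = 0; i < zeros; i++) value = (value << 1) | this.readBit();
      return value - 1;
    },
  };
}
```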
"Once the first media file to play has been chosen, subsequent media files in the Playlist MUST be loaded in the order that they appear and played in the order that they are loaded.
The client SHOULD attempt to load media files in advance of when they will be required for uninterrupted playback to compensate for temporary variations in latency and throughput."
JW Player only loads one chunk; the second chunk is loaded when the first is played, so I get interruptions between chunks on slow connections (a 1 Mbps stream on a 2 Mbps connection). Is it possible to preload, for example, 5 chunks in advance?
It is possible to preload more chunks, but that will only result in less frequent yet longer interruptions. Perhaps you can use a variant playlist to also offer a lower-bitrate version of the stream?
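A variant (master) playlist along those lines might look like this; the URIs and bandwidth values are illustrative, not from the thread:

```
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=500000
low/playlist.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1000000
high/playlist.m3u8
```

A client on a slow connection can then drop to the 500 kbps rendition instead of stalling on the 1 Mbps one.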
Jeroen, thanks for providing the beta to play with! I am really excited to see you working on this!
I was testing and found a few issues I thought you should know about:
- B-frames don't seem to be handled properly. I didn't test this extensively, but playback had the jitter you expect when B-frames aren't reordered properly. Also, switching from Main to Baseline fixed this.
- Non-standard frame rates (I was testing with 10 and 15 fps) seem to cause the player to either throw error #1069 or just display a blank frame. Switching back to 29.97 fps fixed this.
- Same thing with audio sample rates: 32 kHz causes the player to remain blank, while switching to 48 kHz works fine.
- If the video has a non-1:1 SAR, it will play at an incorrect size (if at all). This comes up, for example, when transcoding NTSC video (which has a 40:33 SAR and 4:3 DAR). Scaling the video so it has a 1:1 SAR fixes this.
- The player seems to be picky about which resolutions it supports. I have done almost no testing on this; I only noticed there were a couple that wouldn't load.
Keep up the good work, I know you'll have a quality product when it's ready for release.
Some stuff (like detection of the pixel aspect ratio) is on our radar (detection of dimensions has been in there for a few months). Other stuff (like the B-frames) still needs investigation, it seems.
Another issue: a memory leak. It seems that the player saves every chunk in memory; after 1 hour of viewing, the browser consumes ~2 GB of RAM (if I stay on the same live channel). Is it a JW Player or a Flash problem? P.S. OS: Windows, Linux. Browser: all major browsers. Flash version: 11.0.1.152 or 10.3.x.
The memory leak is our problem. The player indeed stores the full video in memory. We'll fix that when moving to a production-grade version (make it a flag that's disabled by default, or so).
For now, it’s very useful for debugging the transmuxed video.
We don’t have a roadmap at this point. We’re first fixing a few bugs and investigating “if” HLS can be made production quality, or if we should add HDS support in Flash instead.
HLS test links are always welcome, if they can be made available permanently.
This is working pretty well for us. We did have a problem where our SAR/pixel aspect ratio was 4:3 but the video was 16:9; native Apple playback would be OK, but the adaptive provider would be quite blocky. Changing the SAR/PAR to 16:9 fixed that. Now it's working /almost/ perfectly for us, with the exception of some very brief pauses in playback, which you can see here: http://wb.cumulusxp.com/wwe_superstars/ Any thoughts welcome.
Still having that issue with (very brief) pauses; any suggestions? I suspect it's our encode, though the file almost exactly matches the Big Buck Bunny sample used on the demo page. Settings: bc..
h264 (Baseline) ([27][0][0][0] / 0x001B), yuv420p, 720x480 [SAR 32:27 DAR 16:9], 25 fps, 25 tbr, 90k tbn, 50 tbc
Have tried:
- a lower bitrate
- switching the audio/video streams using ffmpeg -map
- changing the segment size
Would it be possible to get your ffmpeg/segmenter command lines? Also, which segmenter did you use? I found the newer one, based off Carson's but updated for the latest version of ffmpeg/libav, worked well: http://code.google.com/p/httpsegmenter/
I'm also having an issue where scene changes aren't working well: http://i.imgur.com/PUj0g.png The video is Baseline, 1 keyframe/sec. You can see QuickTime plays it back (.m3u8) fine, but Flash does not. Looks to me like the video data is not getting into the pipeline quickly enough?
From a brief glance at the source, I recall seeing it convert M2TS to FLV before putting data in the pipeline, which sounded like a bit of an inefficient hoop to have to jump through. Not sure if that could be the cause; going to have a look now...
This is an example of the (worst-case) sort of blockiness, the adaptive provider vs. Strobe with Adobe HDS. Again, native playback in QuickTime on OS X was fine; this does seem to be a problem only when using the adaptive provider. I could provide a sample of that clip if it would help; let me know where to email it. http://i.imgur.com/PQOPJ.png
For the time being, I think in our case we're unfortunately going to have to go with OSMF/HDS.