Streaming Media East

For once you’ll be getting more regular updates here on Geekteq in the coming week. This is mainly because I’m heading to New York City next week to visit Streaming Media East. I’ll hopefully be posting on a daily basis with news and reviews of what I’ve heard during the day.

My plan right now is to join these two pre-conference sessions, which I find the most interesting of the four I can choose from:

Encoding for Multiscreen Delivery

Learn how to create a set of video files that will play on all devices, from smartphones to computers and OTT devices. The class starts by exploring key concepts like protocol and container format and technologies like HTTP Live Streaming, DASH, and Dynamic Streaming. Then it moves to a technical overview of the H.264 specification to identify those configuration parameters that impact quality and those that don’t, and how they affect playback compatibility. Then we’ll review the technical requirements for single and multiple file delivery to Flash, HTML5, iOS, Android, Windows Phones, Windows 8, and the Apple TV, Boxee, Roku, and other OTT devices. Along the way, you’ll learn the current encoding and delivery practices used by high-profile broadcast and corporate sites to help refine your technology decisions. You’ll walk away knowing the technical requirements for delivering to all key platforms and with an understanding of how to do so.

and then in the afternoon it’s time for this one (I’m using Wowza intensively in my projects, so it fits rather well, wouldn’t you say?)

Wowza Media Server and End-to-End Workflows

There is a confusing array of products and solutions for streaming video to end users, but few that simply deliver your content to any device. If you’re serious about video, you’ve been seeing and hearing more references to Wowza Media Server. In this session, you will learn about the end-to-end deployment workflows that you can build using this unified streaming media server software. You’ll see live demos and find out about the newest Wowza functionality. Discover how Wowza works with cloud delivery and stacks up against other media delivery options. Finally, gain insights about emerging technologies (such as MPEG-DASH, H.265, and HTML5) and how to future-proof your streaming media deployment.

For the actual conference sessions on Tuesday and Wednesday I haven’t set anything in stone – some of them clash, and I’d like to see both but can only choose one… *bummer*

But… in short, I think it will be these:

Tuesday

Encoding Video for iDevices

This session starts by detailing the playback specs for all iDevices, old and new. Then you’ll learn the strategies used by prominent iTunes publishers to serve the complete range of installed iDevices. Next, the seminar switches to cellular wireless delivery, with a technical description of Apple’s HTTP Live Streaming (HLS), including recommendations for the number of streams and Apple’s encoding parameters. You’ll walk away knowing how to encode for both iTunes and mobile delivery to iOS and compatible devices.

It’s not as if I don’t already know how to do this, but I might have missed something. This is one of the sessions I might swap because it clashes with another one 🙂 The three sessions below are a must for me…

Content Preparation And Transcoding For Multiscreen Delivery

With the introduction of adaptive streaming formats, a growing number of IP-enabled streaming boxes, and the proliferation of handheld devices, content owners face increasing challenges for multi-screen content preparation. A key part of that content preparation is the encoding of content for each device, the algorithm choices/trade-offs, codec settings, and the particular requirements of various distribution platforms. This session will analyze the key components of file-based transcoding and will talk practically about converting content for multi-screen delivery.

Understanding the Significance of HEVC/H.265

The most recent video compression standard, HEVC / H.265, was placed into final draft for ratification earlier this year and is expected to become the video standard of choice over the next decade. As with each generation of video compression technology before it, H.265 promises to reduce the overall cost of delivering and storing video assets while maintaining or increasing the quality of experience delivered to the viewer. This session will address what H.265 is, how it differs from previous generations of compression technology including H.264, key barriers to widespread adoption, and thoughts on when H.265 is likely to be implemented.

MPEG-DASH: The Next Steps Towards Broad Adoption

Members of the DASH Industry Forum will discuss what concrete steps have been taken in order to foster fast adoption of the new industry standard for adaptive streaming over HTTP. The session will discuss the recently published DASH264 Implementation Guidelines that cover both live and on-demand services, MPEG-DASH profiles, audio and video codecs, closed-caption formatting and common encryption constants. The panel will consist of representatives from all relevant parts of the ecosystem and will address the practical matters and relevance of MPEG-DASH based service roll-outs for streaming and hybrid broadcast applications.

Then we end the day with another clash. It will either be…

Evaluating the Effectiveness Of Your H.264 Encoder

Not all video encoders are created equal. In this session, the real-world video outputs of top commercial H.264 encoders are compared, including those from Telestream, Harmonic, Sorenson, and Adobe, as well as open-source options such as FFmpeg and x264. Learn what features you should have available in an encoding tool before you invest your organization’s budget in a solution.

or…

Battle Of The $99 Streaming Boxes

With so many streaming devices in the market, trying to determine what each one offers in the way of streaming quality and content inventory can be quite confusing. In this special session, Dan Rayburn will present hands-on demos showcasing the leading streaming devices, including those from Apple, Roku, Boxee, Western Digital, Sony, Vizio and Netgear. Attendees will see these devices in action, learn which content platforms they run, and have a chance to ask questions.

I’m sort of leaning towards the latter, since I can already mostly tell which H.264 encoders work better or worse – but I might learn something new. When it comes to streaming boxes I’m a bit blind (there are too many of them), and it would be good to see some of them in action.

Wednesday

The first session is a no-brainer for me…

Building a DASH264 Client

With all the device fragmentation in the market, it is getting increasingly difficult to provide content to all of them equally. The MPEG-DASH specification promises to unify the field and provide a ubiquitous format that can be used by most devices. This technical session explores how to build a DASH264 player. We will explore a few different players, including one built in JavaScript using the MediaSource APIs to run natively in some browsers, and another using OSMF and ActionScript that can run in any browser with a Flash player.

This one is sort of the best of the worst in that time slot – I’ve done so many live streaming cases that I could probably host it myself (kidding), but it’s always good to get new input from others as well.

Best Practices For Live Streaming Delivery

This session provides best practices, lessons learned, and a general overview of the technical set-up for a professional live streaming production. Learn about transmission methods (IP, cellular, fiber, satellite), encoding on site or off, picking the proper encoder for the job (software vs. hardware), maximizing encoder & CDN efficiency, and delivering adaptive HD streaming to the desktop, mobile, and OTT boxes. Come learn how to improve your next live event.

Following that, I’m thinking this could be interesting to hear, since I’ve got partners who were involved in that particular case…

How The BBC Ensured Live Streaming Resilience For The Olympics

Live video streams were key to the ambitious online user proposition for the London 2012 Olympics, and that coverage had to mirror the very high traditional broadcast standards of resilience and quality. Hear the challenges the BBC faced when designing a resilient HTTP streaming infrastructure able to cope with huge volumes. Learn about the solution the BBC used during the games and hear what changes to their methodology were required to build resilience into a cloud-based infrastructure.

The last part of the conference seems to be winding-down sessions, and I can’t for the life of me decide which of them to go to at this moment. None of them interests me more than the others. We’ll see where I end up.

Stay tuned – more to come as the conference progresses.

Broadcast Live to Wowza from iOS and Android using myCast

This will be a relatively short post even though it’s the first in forever – sorry about that, a hectic professional life will do that to you.

Something that I’ve been looking for, for what seems like ages, is the possibility to broadcast from a mobile device to Wowza – basically turning my iPhone or Android phone into a camera and broadcasting to a Wowza server that I manage myself.

There are tons and tons of apps that broadcast from iOS & Android devices to specific services, but there hasn’t been one that lets me use my own setup on my own domain. Until now!

Enter myCast. This little $2 app turns your iOS or Android device into a full-fledged mobile live-broadcasting camera.

I say this with a wink. You can’t really expect to get HD quality out of it (yet), and I’d certainly recommend using some sort of head-mounted microphone/headset if you’re going to use it for anything other than fun stuff.

Here’s what the developer has to say about the product:

(in the event that I don’t agree, my comments will be in italics and another color)

myCast allows you to use your iPhone or iPad as a mobile streaming video camera. Used with Flash Media Server or Wowza Media Server, you can stream and record* video anywhere you have 3G mobile phone reception or a WiFi network connection.

Videos are streamed using the FLV media container to give low-bandwidth capabilities, ideally suited to mobile bandwidth restrictions. You have control over the host, application and stream names; you can even specify a username and password if you want to authenticate the source of a stream on your media server.
(Someone correct me if I’m wrong, I often am, but can, for example, an iPhone see the FLV container using the built-in HTTP Live Streaming protocol in Wowza? I think not!? I’ll have to check this out and comment later – since Wowza can do many conversions on the fly it might be possible. See the config sketch after the developer’s notes below.)

During live streams you maintain control over the resolution, and frames per second of the live stream, with realtime feedback provided via a simple traffic light colour coded wireless icon, allowing you to know how reliably your stream is being sent. Additionally, you can switch between front and rear cameras as well as mute the audio. All these settings are available during live streaming, leaving you with complete control.

Features
★ Multi-Resolution Support (640 x 480 / 320 x 240 / 160 x 120)
★ Front & Rear Camera switching where supported
★ Adjust Frames Per Second
★ Mute to prevent audio publish
★ Publish & Play (a single handset can only publish or receive at any one time, multiple handsets, or a Flash player in a web browser will be required to view the stream without an extra handset).

* Recording of a stream is managed by the media server configuration.

Notes from the developer
– This application is intended for users who already have access to a streaming media server that supports the FLV container. There is no public streaming service provided by the developer.
– This application is Universal, working on the iPad as well as the iPhone. However, if using the original iPad you will only be able to use it to view streams due to the lack of camera.
– Android handsets that do not support continuous autofocus will be missing that feature in this application. For example, the Samsung Galaxy SII forward-facing camera supports continuous autofocus, but the rear camera does not.
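
To partly answer my own question above: if the app publishes H.264 video and AAC audio over RTMP (which is what the FLV container typically carries from apps like this), Wowza should be able to repackage the stream for iOS without transcoding – you just need the Cupertino (HLS) packetizer enabled on the live application. A minimal sketch of the relevant part of conf/[application]/Application.xml, with the application and stream names as placeholders:

[pre_code]
<Streams>
    <StreamType>live</StreamType>
    <!-- cupertinostreamingpacketizer = Apple HTTP Live Streaming output -->
    <LiveStreamPacketizers>cupertinostreamingpacketizer</LiveStreamPacketizers>
</Streams>
[/pre_code]

Playback on the iPhone would then be something like http://your-wowza-host:1935/live/myStream/playlist.m3u8. If the app sends codecs that HLS can’t carry, this won’t work without transcoding – I’ll verify and report back.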

Conclusion

For $2 this app is a good thing. I’d be willing to pay $20 for a “professional” version. Although I don’t know what I’m missing right now, I’m sure I could figure out something that would make a good “professional” feature. The sales people at work think this could be a good way to sell live broadcasts from people who are constantly on the move and don’t mind a “lower” quality output.

(the image/screenshot is from the developer so copyright Codeghost LTD for that)

DRM is pretty much useless

I know, I’m kicking the hornet’s nest here. There’s no doubt about it. Before we continue I want to make it clear that I’m basically only talking about streaming in this post; it does not relate to downloadable content, simply because I don’t work with that – yet.

The basis of my position is that no DRM will ever prevent me from copying content if I really want to copy it. Therefore, it’s useless.

Yes, I’m not a fan of DRM, mainly because it adds load to servers and makes it more or less impossible to make content available regardless of platform. Today, unless I use very expensive DRM systems, I’d have to run about 3–5 different servers to serve up content for different platforms, and at least the same number of DRM systems – at least if I’m not going to use some proprietary client that costs an arm and a leg to build and distribute.

Usually, when a client comes to me and says ‘what about DRM? I want my content to be safe!’, I describe the problem and the solution like this:

‘If you want DRM you can have it. But….’

And then I explain the cost of it to the client, and continue by telling him that I can, if I get access to the content legally, copy the movie without any big problems, and there’s nothing he or I can do about it. As long as someone has legal access to the content – such as a viewer who’s paid their subscription or a one-time fee to watch a clip – they can copy it. As long as I can see the content on my screen and hear it through my sound card, I can copy it. In my case I can copy it in HD, since my computer can play that back without much hassle. So I get quite good quality, and I can do it semi-legally as long as I’ve paid to see it once (or got it for free). Of course, the copy itself is illegal, but nothing can stop it from being made.

There are of course up-and-coming techniques such as UltraViolet, but even that has gotten a lot of grief from users about quality, and if I had anywhere to test it I could probably still get around the ‘protection’ and dump it to disk. The costs are, for me, as of yet either unknown (as in I simply don’t know) or, judging from the sites I’ve seen that seem to provide some kind of middleware, we’re talking loads of money going away from me and heading for the middleware makers… some of them have hilarious pricing tables – what would you say about $200,000/year (and oh, you have to sign up for 5 years)… good luck with that…

Cheaper and simpler? Watermarking. Either the old-school version of burning in a logo – which of course can be covered up in almost any video editing software – or, still untested by me, doing it digitally in the background so you can see who spread the content. The latter seems most interesting to me, since I could see who the culprit is, but it’s still a hassle.
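
For the burned-in variety you don’t even need a video editor, by the way – a rough sketch using ffmpeg’s overlay filter (file names are just placeholders):

[pre_code]
# burn logo.png into the top-left corner of the picture, leave the audio untouched
ffmpeg -i input.mp4 -i logo.png \
  -filter_complex "overlay=10:10" \
  -c:v libx264 -c:a copy output.mp4
[/pre_code]

Which, of course, is exactly why it is so easy to crop or cover up again.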

So what do I suggest instead? Nothing! Huh? The short answer is: price your content in a way that makes the hassle of ripping it and distributing it illegally less interesting. That’s my three cents after tax. I’d like to hear what you think. And yes, you can call me an idiot if it makes you feel better – but it won’t make me change my opinion.

VLC and Wowza sitting in a tree…

…. K-I-S-S-I-N-G.. 🙂

So as I’ve told you before, it’s been a busy few months and a lot has been going on. I’m currently working on a project for a client where we need to capture multicast IPTV from the headend, transcode it, and then broadcast it to mobile terminals (iPhones, Androids, and older 3GPP terminals as well).

It’s been an interesting time trying to figure out how to get VLC to jump through hoops and work together with Wowza without a lot of quality loss along the way.
During the testing phase of the project, around the same time as I was testing the Wowza Media Server 3 Preview, we tried out the transcoding add-on from Wowza. It might have gotten better since the preview I tested, but since we planned on being done BEFORE the release of Wowza Media Server 3, we decided against using it.

Enter VLC Media Player

My readers from before will know that I’m sort of a *nix nerd, and as such I ‘refuse’ to use other operating systems if I can help it. So the operating system in the setup below is CentOS 6 straight through. After a few hours, more like a day, of trying to get VLC to compile into the version I needed, I actually gave up and went out hunting – and found precompiled versions of what I needed out on the net. Installed, and whammo.

Running VLC from the command line with a VLM ‘cfg’ file is an interesting thing. Documentation, at least that I could find, is scarce. Below is an example of how I configured one of the channels, telling VLC to pick up the multicast TS stream and then send it on to Wowza. (The long output line is a single VLM command, wrapped here for readability.)
[pre_code]
new channel1 broadcast enabled
setup channel1 option program=480
setup channel1 input "udp://@127.0.0.1:1112"
setup channel1 output #duplicate{dst="transcode{venc=x264{keyint=60,profile=baseline,level=3.0,nocabac},
    vfilter=canvas{width=480,height=270,aspect=16:9},vcodec=x264,vb=416,scale=1.0,
    acodec=mp4a,ab=64,channels=1,samplerate=44100}",select="program=480"}:
    duplicate{dst=rtp{mux=ts,dst=172.16.1.2,port=40010,ttl=3},select="program=480"}
control channel1 play
[/pre_code]

So what am I doing? In short: setting up a new channel, telling it that the multicast program ID we’re looking for is 480 and that the multicast is at localhost UDP port 1112. The output is doubled – in effect, I transcode the stream and then send the transcoded material on to another machine, to a specific port on that machine. Then I tell the channel to start doing its job. Then rinse and repeat, in this case, 7 channels with two encodings each… so the above, 14 times per encoder.
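
For completeness, this is roughly how I start VLC with that config on CentOS – the telnet interface lets me poke at the channels afterwards. The path and password are placeholders, and the exact option names may differ between VLC versions:

[pre_code]
# load the VLM config at startup and expose the telnet control interface
vlc -I telnet --telnet-password changeme --vlm-conf /etc/vlc/channels.cfg
[/pre_code]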

To put it mildly, the processing power of these machines is nice: dual processors with 12 cores each and 24 gigs of RAM. Still, the VLC process gulps up about 600% of the CPU, which looks weirder than it is… the load? About 3 when humming along…

Wowza backend configuration

The “backend” Wowza server is roughly half the size and processing power of the encoders (and of the frontend Wowza). This is mainly because it doesn’t have very much to do: it receives all the encoded material on its different ports. I’ve set it up as a live origin, mainly because I want the frontend to be easily scalable. Adding more hardware and frontend streaming servers is quite easy if you use Wowza’s own load-balancing solution. That’s the main reason I set up origin-edge from scratch. At the launch of the service I could probably have survived most of the traffic on this machine alone – but the client wants scalable, the client gets scalable. The backend Wowza is ONLY available on the internal network in this setup. The only port exposed to the outside is the Stream Manager, so we can reset streams if needed without restarting Wowza completely.
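
For those who haven’t done an origin-edge setup in Wowza before, the gist of it lives in the StreamType of each application’s Application.xml. A stripped-down sketch, with the application names and the origin address as placeholders:

[pre_code]
<!-- backend (origin) application -->
<StreamType>liverepeater-origin</StreamType>

<!-- frontend (edge) application -->
<StreamType>liverepeater-edge</StreamType>
<Repeater>
    <OriginURL>rtmp://10.0.0.10/liveorigin</OriginURL>
</Repeater>
[/pre_code]

The edges pull the streams from the origin over the internal network, and you scale out by simply adding more edges behind the load balancer.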

Wowza frontend configuration

A workhorse, to say the least. Expandable hardware is nice. It’s basically a 2U version of the encoders, with 24 gigs of RAM and the same processor and core count. I’ve even milked the support personnel at Wowza for tips on optimizing this setup so it performs as well as it can without overdoing it.

Example: I have 24 gigs of RAM, but Wowza Support doesn’t recommend giving Wowza itself more than 8 gigs. Why? Because beyond that, garbage collection on the Java side starts taking too long, and that hampers performance. The server is basically running a live edge setup, times two, since we’ve had to implement an old workaround (feature) in Wowza that makes it possible to serve 3gpp (mp4latm) instead of mp4 (Android/iPhone) for the ooooold terminals. How I wish people would kick their old phones to the curb and get new ones. It would surely make my life a lot easier.
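
For reference, in my Wowza 3 install on Linux the Java heap is set in [install-dir]/bin/setenv.sh – the exact file and defaults vary between versions, so treat this as a sketch rather than gospel:

[pre_code]
# cap the Java heap at roughly 8 GB per Wowza Support's advice; more just means longer GC pauses
JAVA_OPTS="-Xmx8000M"
[/pre_code]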

This machine is currently the only one available to the public.

Endpoint?

Well… as usual with projects I’m involved in, the product isn’t available outside of Sweden, or outside a certain cell network within Sweden. So I can’t show you anything worthwhile ‘live’ and direct. If you’re interested in seeing the end product, ask me and I can perhaps provide you with a temporary link that you can try out.

I can, however, show you a screen capture from my iPhone 4 (the old one, not the new and improved one).

The only thing I’ve done is hide the channel and the subtitling – I’m sure someone can work out which channel and program it is anyway, but at least I’ve tried. The image actually looks better on my iPhone than it does in the “real world”.

If you have any questions regarding this setup, don’t hesitate to ask – if the questions get too intricate, then we can discuss consulting fees 🙂