09-13-2012 07:13 PM
I want to stream video from a PC to a native application; I have the bytes being streamed to my native application over a UDP socket.
What options does the NDK give me to display this video as it's being streamed? I read somewhere that it may be possible to do this using an ever-expanding file, but I can't find any reference to it anywhere else, and I'm fairly new to C programming.
Thanks for any information you can provide.
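As a rough sketch of the receive side described above, assuming a plain `SOCK_DGRAM` socket bound to a known port (the function names are illustrative, not from any sample):

```c
#include <arpa/inet.h>
#include <netinet/in.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/* Bind a UDP socket on the given port (0 = any free port) and return
 * the descriptor, or -1 on error.  The PlayBook side would then sit in
 * a receive loop on this socket. */
int udp_listener(unsigned short port) {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0)
        return -1;
    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(port);
    if (bind(sock, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        close(sock);
        return -1;
    }
    return sock;
}

/* Receive one datagram of video data into buf; returns the byte count,
 * or -1 on error.  Each call yields one packet from the PC. */
ssize_t udp_receive(int sock, unsigned char *buf, size_t len) {
    return recvfrom(sock, buf, len, 0, NULL, NULL);
}
```

Each `udp_receive()` call hands back one datagram's worth of bytes, which would then be fed to whatever plays or stores the video.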
09-14-2012 12:07 PM
Depends on the video format. The device only supports certain formats, so it would have to be one of those. If you try to write to a file, the format would have to be one that doesn't put its headers at the end of the file, otherwise playback won't start. There's mm-renderer, which takes a path to a video, but I haven't used it successfully; it also only takes a file path, not raw data from a stream. The other option is to use ffmpeg to decode the video and draw the frames manually (check out the HelloForeignWindow demo on GitHub). I'm doing this in a live video streaming app and the performance is just fine (I don't use audio yet, so I'm not sure what you'd do for that).
You can also wait until the next NDK release and see if something new comes out.
09-14-2012 01:03 PM
mreed you give me hope with your knowledge!
I hear you on ffmpeg, but I think compiling it for the PlayBook and then calling it would be out of my league right now. Although I have been using Xuggler and ffmpeg (very basic) in recent weeks to write some server code which creates this video from images and sound. So that leaves me with one option currently: mm-renderer.
I was trying the mm-renderer samples last night, but after adding the appropriate includes and code to my .c file and trying to run on the simulator I got a "program not an executable" error, even though all the right things were in place with the libraries added and the .pro file edited with LIBS += etc., and it built OK. I could not get past that error (many hours lost). I then read that this won't work on simulators and gave up. Can you shed any light on that error?
Also, I couldn't find any samples on writing to an ever-expanding file in C/NDK on the net. I don't know if that sounds like a stupid thing to say, but I'm very new to native, coming from Java. Could you give me a code hint on that, please?
Your help is much appreciated.
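For what it's worth, the "ever-expanding file" idea is essentially just an append-mode write: the network thread keeps appending received bytes to a temp file while a player reads from the front of it. A minimal sketch (the function name and path convention are illustrative):

```c
#include <fcntl.h>
#include <sys/stat.h>
#include <unistd.h>

/* Append one chunk of received video bytes to a growing file that a
 * player could be reading from the other end.  O_APPEND guarantees
 * each write lands at the current end of the file.  Returns 0 on
 * success, -1 on error. */
int append_chunk(const char *path, const unsigned char *data, size_t len) {
    int fd = open(path, O_WRONLY | O_CREAT | O_APPEND, 0644);
    if (fd < 0)
        return -1;
    size_t written = 0;
    while (written < len) {
        ssize_t n = write(fd, data + written, len - written);
        if (n < 0) {              /* write error: give up on this chunk */
            close(fd);
            return -1;
        }
        written += (size_t)n;
    }
    close(fd);
    return 0;
}
```

In a real app you would keep the descriptor open across chunks rather than reopening per packet; it's opened and closed here only to keep the sketch self-contained.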
09-14-2012 02:55 PM
I just popped over to talk to the mm-renderer guys.
You should be able to ask mm-renderer to open an http:// URL.
We apparently support http and http-live-streaming (.m3u8 ?)
Your PC-side app just needs to provide files via http.
Trying to do this with a fifo on the filesystem may also work, but you run the risk of hanging mm-renderer if a read would have to block.
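The FIFO suggestion in sketch form: create a named pipe that the player can open as an ordinary filesystem path, while the network thread writes into the other end (the path and function name are illustrative):

```c
#include <errno.h>
#include <fcntl.h>
#include <sys/stat.h>
#include <unistd.h>

/* Create a named pipe (FIFO) at the given path.  A media player can
 * then open that path like a regular file while the network code
 * writes received video bytes into it.  As noted above, the writer
 * must keep the pipe filled or the reader may block.
 * Returns 0 if the FIFO exists afterwards, -1 on error. */
int make_stream_fifo(const char *path) {
    if (mkfifo(path, 0644) < 0 && errno != EEXIST)
        return -1;
    return 0;
}
```

Note that `open()` on a FIFO normally blocks until both a reader and a writer are attached, which is exactly the hang risk mentioned above.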
09-14-2012 04:35 PM
Hi, thanks for the HTTP information. I did notice it in the docs, but I have already streamed the video to Adobe AIR using HTTP/TCP, and the results for real time weren't good enough (a 7-second delay, although the initial 30 seconds of the stream were only about 1 second behind). I spent days researching and trying different encoding attributes, buffer sizes, etc., only to conclude that it had to be UDP, which AIR doesn't support.
So I got my hands dirty in Android thinking it was the perfect solution, only to find that Android also only supports HTTP progressive streaming!
So now I'm attempting native UDP streaming (I have the UDP bytes) but getting nowhere fast, as I don't know how to write to an ever-expanding file in the NDK, which the docs suggest. Then I will try to fix the error I currently have with the mm-renderer sample code and the simulator, and play this ever-expanding file.
I just need a little help/link regarding writing to and reading from this ever-expanding file, as I'm new to C. I think being able to feed bytes into mm-renderer, directly or indirectly via a temp file, would be useful to everyone.
09-14-2012 04:57 PM - edited 09-14-2012 05:02 PM
That's very interesting. I think we are doing very nearly the same thing, although I'm writing my own server in C using TCP sockets and encoding the video from raw frames off the device. One thing I can mention about the real-time playback delay is that there is a buffer between when you write the bytes from the server and when you read them on the device. The write isn't going to block, so you can end up pushing more and more data through the pipe without being able to read it fast enough on the device. I found this out by waiting until the delay was long enough that I could stop sending data from the server and see how long the device would keep reading and displaying new frames (it was about 11 seconds). It's basically a cascading error: you won't really notice it at first, but after a minute or so the delay just keeps getting longer and longer. I'm doing a live stream from the device camera, so I can drop frames (dropping server-side now, but I'll add sender-device-side later) and try to control the delay by pushing less data through the pipe. It's all still a work in progress, though, and I've been on it for over a month so far.
09-14-2012 05:11 PM
As mentioned, you can try using HTTP Live Streaming (which differs from straight HTTP), or try creating a FIFO in the filesystem and playing from it instead (beware of hanging mm-renderer though... keep that pipe filled!)
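For reference, an HTTP Live Streaming playlist (the `.m3u8` mentioned above) is just a text file listing short media segments, which the PC-side server would regenerate as new segments are encoded. A minimal sketch with illustrative segment names and durations:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:17
#EXTINF:2.0,
segment17.ts
#EXTINF:2.0,
segment18.ts
#EXTINF:2.0,
segment19.ts
```

For a live stream there is no `#EXT-X-ENDLIST` tag; the player keeps re-fetching the playlist, and the rolling window of segments is why HLS typically carries several seconds of inherent latency.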
09-14-2012 05:36 PM
Thanks Martin, I will look into the FIFO / HTTP Live Streaming options once I get past the errors when trying to load the app containing the sample video playback code.
mreed, what you say is indeed very interesting to me; the delay of ~10 seconds is almost identical to what I experienced when experimenting for days with encoding my video. The main change that had an effect on this delay was increasing the socket send buffer size, but as you say, eventually the delay seems inevitable. I couldn't get anywhere else with it. I suspected it was a case of sending too much data rather than a slow network, because the first 20 seconds are super fast.
But what can you do to control this with TCP when encoding/sending data in a loop? I tried different sleep amounts, different buffer sizes, etc., but the delay is always there and constant, which is even more weird.
Do you think UDP is indeed the way to get near-real-time video display on the PlayBook, or could I maybe get there with my TCP version with a few tweaks? This is where I have hit a brick wall. I have also spent around a month on this!
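One concrete TCP-side tweak worth trying before giving up on it, sketched here as a helper (the function name and buffer size are illustrative): disable Nagle's algorithm so small writes go out immediately, and shrink the send buffer so the kernel can't queue up seconds of stale video.

```c
#include <netinet/in.h>
#include <netinet/tcp.h>
#include <sys/socket.h>

/* Tune a TCP socket for low-latency streaming:
 * - TCP_NODELAY sends small writes immediately instead of coalescing
 *   them (Nagle's algorithm), trading throughput for latency;
 * - a small SO_SNDBUF caps how much unread video the kernel can queue,
 *   so a blocked send() surfaces backpressure instead of hiding it.
 * Returns 0 on success, -1 on error. */
int tune_for_latency(int sock) {
    int one = 1;
    if (setsockopt(sock, IPPROTO_TCP, TCP_NODELAY, &one, sizeof(one)) < 0)
        return -1;
    int sndbuf = 32 * 1024;   /* illustrative: ~32 KB */
    if (setsockopt(sock, SOL_SOCKET, SO_SNDBUF, &sndbuf, sizeof(sndbuf)) < 0)
        return -1;
    return 0;
}
```

This doesn't remove TCP's retransmission stalls (the duplicate-ACK pauses mentioned later in the thread), but it does stop the sender from silently building up a multi-second backlog.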
09-14-2012 05:56 PM
If you go the route of using your own server setup with sockets to transport the video, I think it will take more than just tweaks. I have to manually edit the video as it's passing through the server. I don't have it set up yet, but I'm also going to have to add an ack system to continually measure the delay and adjust my server-side edits accordingly. It's not a small task.
I'm choosing the harder do-it-myself route, but if you choose HTTP Live Streaming, there are probably third-party solutions available somewhere that you can use.
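The ack idea described above might look something like this in outline: the server stamps each frame with a sequence number and send time, the device echoes the stamp back, and the server drops frames while the measured lag is over budget. All names and the 500 ms budget here are illustrative assumptions, not code from the thread:

```c
#include <stdint.h>

/* Stamp attached to each outgoing frame by the server. */
typedef struct {
    uint32_t seq;       /* frame sequence number */
    uint64_t sent_ms;   /* server clock when the frame was sent */
} frame_stamp;

/* Called when the device acks 'stamp' and the server clock reads
 * 'now_ms': the round-trip age of that frame is the lag estimate.
 * (With one clock on both ends of the measurement, no clock sync
 * between server and device is needed.) */
uint64_t measured_lag_ms(frame_stamp stamp, uint64_t now_ms) {
    return now_ms - stamp.sent_ms;
}

/* Drop outgoing frames while the last measured lag exceeds the
 * latency budget, so the backlog drains instead of cascading. */
int should_drop_frame(uint64_t last_lag_ms) {
    const uint64_t budget_ms = 500;   /* illustrative budget */
    return last_lag_ms > budget_ms;
}
```

The ack traffic itself is tiny (a few bytes per frame, or per N frames), so it adds negligible latency; what it buys is a continuous measurement that the sender can react to.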
09-14-2012 06:17 PM
Thanks mreed. I've probably exhausted your time today already, but my confusion is getting the better of me.
Am I understanding this right: the manual editing taking place server-side that you refer to is because the amount of data being captured in real time is too much to get to the client in near real time, so you will drop frames (edit)?
Won't adding an ack system (I'd be interested in what you mean in more depth... so many questions!) add more latency and work to the whole system? Why not UDP for you? (When trying to get to the bottom of my delay issue, I noticed Wireshark showed a lot of duplicate ACK packets, which seemed to cause pauses.)
Is there a possibility then that the HTTP Live Streaming that Martin refers to could overcome this large 8-10 second delay (and is it therefore different from the normal HTTP/TCP which I have already tried)?