To add to Steve's comments, one key benefit of an ARM server, beyond high power efficiency, is that you can run ARM Android binaries natively on ARM-based servers. As Steve says, latency is quite low, and the approach is ideally suited for network operators, since latency is something they have some control over, unlike OTT companies.
App streaming as used here does mean running the app on the server, taking the framebuffer output, H.264-encoding it, and sending the "app stream" to the client, where it is decoded. User events (keys, touch, game controller) flow back from the client to the server. As for latency: in a key-driven app (e.g. an STB remote control), the app feels local when latency is <125 ms, and for a 3D game, latency should be <75 ms. In the Netzyn implementation, essentially the only contributor to latency is the network, and both of these requirements are easily met for this type of use case with servers in just one or two data centers. Also, H.264 frames are sent only when something changes, so average bandwidth for an STB guide app is generally <100 Kbit/s, with instantaneous peaks of a few Mbit/s.
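To make the change-driven part of that pipeline concrete, here is a minimal, hedged sketch of the server-side loop: capture the app's framebuffer, encode a frame only when its contents changed, and push it to the client. The function names (`stream_session`, `h264_encode`) and the hash-based change check are illustrative assumptions, not Netzyn's actual implementation; the encoder is a stub so the loop is runnable.

```python
import hashlib

def h264_encode(fb: bytes) -> bytes:
    # Stub standing in for a real H.264 encoder (e.g. a hardware
    # encoder block on the server). Assumption, not Netzyn's API.
    return b"NAL:" + fb[:8]

def stream_session(frames, send):
    """Encode and send each framebuffer, skipping frames identical
    to the previous one, so a static screen costs no bandwidth."""
    last_digest = None
    sent = 0
    for fb in frames:                        # fb: raw framebuffer bytes
        digest = hashlib.sha256(fb).digest()
        if digest == last_digest:            # nothing changed on screen:
            continue                         # send no frame at all
        last_digest = digest
        send(h264_encode(fb))                # only changes hit the network
        sent += 1
    return sent
```

For example, a static guide screen held for 59 frames followed by one update produces only two encoded frames on the wire, which is why average bandwidth stays low even though peaks reach a few Mbit/s when the screen changes.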
Steve
Netzyn, Inc
Hi, I'm wondering if the demo is really based on video streaming, or if it relies on what Netzyn calls its Streaming OS Architecture, where the OS is streamed and executed on the client side. If it is really based on video streaming, I'm wondering how latency is hidden and how video quality is so good. To me, encoding a 1080p screen at 60 fps with H.264 costs a lot, and streaming it over IP adds significant latency. Can you provide answers?
Thanks in advance for your feedback.
Sebastien