= Dynamic Video Encoding =

[[TOC(Other/Summer/2015*, depth=3)]]

== Introduction ==

Live-streaming video from a device to the Internet efficiently can be a tricky task. The concept of video streaming itself is simple: the device records whatever is going on and streams the video data live to a video-hosting website. The problem arises from the nature of internet/data/WiFi connections: an unstable connection results in a low-quality, partially corrupted, blurry video file on the streaming server. The original video file can always be uploaded later to wherever the video was streamed, but simply re-uploading the entire file wastes bandwidth and time. Pieces of the video were already sent to the streaming server during the live stream, so why not use the information previously sent to determine and upload only what is missing? That is the goal, and the name, of our project: Dynamic Video Encoding.

== What We've Accomplished So Far ==

 * Worked through the ORBIT tutorials.
 * Established WiFi connectivity between two nodes on the ORBIT testbed.
 * Simulated video corruption by transmitting videos from one node to another under varying degrees of packet loss, using the Linux network emulator (netem).
 * Established a working knowledge of Ruby.

== Goals ==

 * Automate the bash script procedures with Ruby scripts.
 * Implement Dynamic Adaptive Streaming over HTTP (DASH).
 * Streamline DASH encoding.
 * Run DASH streaming tests with variable bitrate control.

== Background Information ==

=== Anatomy of a Video File ===

 1. Container
  * Defined by the file extension
  * Holds one or more audio/video streams
  * Specifies the encoding scheme of the contained streams
   a. The encoding scheme defines the algorithm(s) used to encode/decode the video
 2. Content stream
  * Contains the encoded audio/video data
  * Restricts playback to decoders that support the specific codec used

=== What is a CODEC? ===

 1. Encoder-Decoder
  * Algorithms defined in a compression standard
  * A single codec can sometimes encode/decode several compression algorithms
  * Popular compression algorithms include the JPEG/MPEG families

=== H.264 Compression Algorithm ===

 1. Defined in MPEG-4 Part 10: Advanced Video Coding
  * Implements many common image/video compression techniques
   a. Discrete Cosine Transform (DCT)
   b. Motion compensation

=== Scalable Video Coding ===

 1. Splits a high-quality bit stream into component layers using several methods:
  * Temporal scalability
  * Spatial scalability
  * SNR scalability
 2. The decoder recombines selected layers to form the output video stream

[[Image(Scalable Video Coding.png)]]

=== Network Emulator Test Results ===

 1. Original file: [[Image(original.gif)]]
 2. 5% packet loss simulation: [[Image(5Loss.gif)]]
 3. 8 Megabit connection simulation: [[Image(8mbit.gif)]]
 4. 12 Megabit connection simulation: [[Image(12mbit.gif)]]

=== DASH Multi-Bitrate Encoding ===

 * DASH: Dynamic Adaptive Streaming over HTTP
 * Encodes a single video into multiple resolutions/bitrates
 * Video files are segmented into fixed time intervals
 * A manifest file describes the resolutions and bitrates of the separately encoded files
 * Client and server monitor the network connection and switch between video files as necessary

=== DASH Content Generation ===

 * Minimal existing software support for encoding/playing DASH content
 * Encode multiple copies of each video using ffmpeg
 * Generate manifest files using MP4Box and combine them manually

=== Bitrate Profiles ===

 1. Camera bandwidth profile: [[Image(Camerabandwidth.png)]]
 2. Client bandwidth profile: [[Image(Clientbandwidth.png)]]
 3. Complete bandwidth profile: [[Image(Completebandwidth.PNG)]]

By analyzing both the camera and client bandwidth profiles, we hope to achieve better stream quality by dynamically adapting to network conditions and improving quality where possible.

== People ==

Professor Anand Sarwate[[BR]]
Professor Roy Yates[[BR]]
Danny Ayoub[[BR]]
Kush Oza[[BR]]
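== Example: Scripting the netem Scenarios ==

One of the goals above is automating the bash procedures with Ruby scripts. As a minimal sketch of what that could look like, the following builds the `tc`/netem command lines for the three emulation scenarios from the network emulator test results. The interface name `eth0` and the exact flags are illustrative assumptions, not the scripts actually run on the ORBIT nodes.

```ruby
# Build the tc/netem command that applies a given impairment to an
# interface ("replace" installs the rule or overwrites an existing one).
def netem_cmd(iface, impairment)
  "tc qdisc replace dev #{iface} root netem #{impairment}"
end

# The three emulation scenarios from the test results above.
SCENARIOS = {
  "5% packet loss" => "loss 5%",
  "8 Mbit link"    => "rate 8mbit",
  "12 Mbit link"   => "rate 12mbit",
}

SCENARIOS.each do |name, impairment|
  # Print the command; uncomment system(...) to actually apply it
  # (requires root on the sending node).
  puts "#{name}: #{netem_cmd('eth0', impairment)}"
  # system(netem_cmd("eth0", impairment))
end
```

After a test run, `tc qdisc del dev eth0 root` removes the emulation again.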
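== Example: Scripting DASH Content Generation ==

The DASH content generation steps above (encode multiple copies with ffmpeg, segment and describe them with MP4Box) could be automated the same way. The rendition ladder, file names, and tool flags below are illustrative assumptions; in particular, this sketch has MP4Box write one manifest covering both renditions at once, whereas the workflow described above generated manifest files separately and combined them by hand.

```ruby
# Hypothetical rendition ladder (two H.264 copies of the source video).
RENDITIONS = [
  { name: "1080p", size: "1920x1080", bitrate: "4000k" },
  { name: "480p",  size: "854x480",   bitrate: "1000k" },
]

# ffmpeg command encoding one rendition of the input file.
def encode_cmd(input, r)
  "ffmpeg -i #{input} -c:v libx264 -b:v #{r[:bitrate]} " \
    "-s #{r[:size]} -c:a aac #{r[:name]}.mp4"
end

# MP4Box command segmenting the renditions (4 s segments, cut at
# random access points) and writing a single MPD manifest.
def dash_cmd(renditions)
  files = renditions.map { |r| "#{r[:name]}.mp4" }.join(" ")
  "MP4Box -dash 4000 -rap -out manifest.mpd #{files}"
end

# Print the pipeline; replace puts with system(...) to run it for real.
RENDITIONS.each { |r| puts encode_cmd("input.mp4", r) }
puts dash_cmd(RENDITIONS)
```

The resulting `manifest.mpd` is what a DASH client fetches first to learn which bitrates are available before it starts requesting segments.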