Hello everyone,
I am always confused about two streaming-media parameters -- fps (frames per second) and bit rate (e.g. 256 kbps, 512 kbps, etc.).
I do not want to know the mathematical and algorithmic internals; I just want to confirm whether my understanding below is correct.
I think 15 fps and 256 kbps means that, for each second, the server will send 15 frames, and those 15 frames together add up to 256k bits on the client side. The frames could be of any type, key frames (I mean I-frames in MPEG) or non-key frames. The number of bits used to represent each of the 15 frames in one second need not be the same, as long as the total comes to 256k bits. Is that understanding correct?
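To make my mental model concrete, here is a small sketch (the frame sizes are made up purely for illustration, not taken from any real codec):

```python
# My understanding of 15 fps at 256 kbps: a per-second budget of 256,000 bits
# that can be split unevenly across the 15 frames (I-frames larger, others smaller).

FPS = 15
BITRATE_BPS = 256_000  # 256 kbps

# Hypothetical per-frame sizes in bits: one big I-frame plus 14 smaller frames.
frame_sizes = [60_000] + [14_000] * (FPS - 1)

total_bits = sum(frame_sizes)
print(f"frames this second: {FPS}")
print(f"bits this second  : {total_bits}")
print(f"within budget?    : {total_bits <= BITRATE_BPS}")
```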
thanks in advance, George