Hi, I am writing my master's thesis and am in contact with a digital signage company; I'm writing about the distribution of large amounts of data. I need some ideas or documented experiences with transferring large amounts of data (images and video, ~100 MB to ~1 GB, but any data would do; large datasets raise many of the same problems) to multiple clients.
Does anyone know of a method I could look into for approaching this in a structured manner, or could you at least point me in a direction (other theses, books, papers, people)?
My main approach right now is to resolve a few things: 1. How can I make sure the data is intact when it arrives (not corrupted, i.e. the .png will still open)? 2. How can I determine whether I have received all the data? 3. ...? (See the sketch below for how I imagine 1 and 2 could work.)
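For points 1 and 2, the standard answer seems to be a per-file content hash plus a manifest listing every expected file: the hash catches corruption, the manifest catches missing files. Here is a minimal Python sketch of that idea; the manifest format and the `verify_download` name are just my own assumptions, not anything from a real product:

```python
import hashlib
import json
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large videos don't need to fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB at a time
            h.update(chunk)
    return h.hexdigest()

def verify_download(manifest_path: Path, download_dir: Path) -> list[str]:
    """Return a list of files that are missing or corrupted.

    Assumes the server publishes a JSON manifest of the form
    {"filename.png": "<sha256 hex digest>", ...}.
    """
    manifest = json.loads(manifest_path.read_text())
    problems = []
    for name, expected in manifest.items():
        local = download_dir / name
        if not local.exists():
            problems.append(f"missing: {name}")    # question 2: is everything here?
        elif sha256_of(local) != expected:
            problems.append(f"corrupted: {name}")  # question 1: is it intact?
    return problems
```

A client would fetch the manifest first, download the files, then call `verify_download` and re-request only whatever comes back in the problem list.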
Any input is welcome. The current approach is streaming via web services; I'm going to look into a BitTorrent (P2P) approach, but it seems like a poor fit, since each client can be showing different content.
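If you stay with plain web services, HTTP Range requests already give you chunked, resumable transfer without BitTorrent's complexity, which matters for gigabyte files on flaky links. A rough sketch of what I mean, assuming the server honors the `Range` header (the URL would be whatever your web service exposes):

```python
import requests
from pathlib import Path

def download_resumable(url: str, dest: Path, chunk_size: int = 1 << 20) -> None:
    """Download a file over HTTP, resuming from a partial local copy if one exists."""
    start = dest.stat().st_size if dest.exists() else 0
    headers = {"Range": f"bytes={start}-"} if start else {}
    with requests.get(url, headers=headers, stream=True, timeout=30) as r:
        r.raise_for_status()
        if start and r.status_code != 206:
            # Server ignored the Range header; restart from scratch.
            start = 0
        with dest.open("ab" if start else "wb") as f:
            for chunk in r.iter_content(chunk_size):
                f.write(chunk)
```

Combined with the manifest check above, an interrupted client can resume where it left off instead of re-downloading a whole video.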
Could any of you out there who work for a digital signage company tell me a bit about how you approach this? Or, if you have any experience moving large datasets from server to client, what is your approach?