Okay, imagine I am broadcasting a 36 Mbps UDP stream (54 Mbps air rate), multicast PtMP. Each client unit is receiving the whole stream and picking out whichever channel it's looking for via multicast address. What would happen if one of those clients had a bad signal? Would it slow down all of the other clients?
I suppose I could force a 48/54 Mbps data rate on the tower side, so a weak client would simply lose signal, right? Basically this would be a one-way real-time video broadcast, so if a client loses signal there's no catching up, just lost packets (and a "waiting for signal" screen).
Now to the client side: if I use RouterBoards for my clients, and assuming each channel takes up 2 Mbps, could a board pick up just the multicast packets destined for it without losing them, even though I'm broadcasting faster than it can process the whole stream? I know a RouterBoard can't process more than about 24 Mbps, but would it be able to simply ignore the extra packets it isn't waiting for?
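At the IP layer, what I'm picturing looks like the sketch below (group addresses and port are made up, and it runs over loopback just so it's self-contained): a receiver only gets datagrams for the groups it has joined, and the kernel drops everything else before the application sees it. Whether the board's wireless card can drop the other groups in hardware, before they ever hit the CPU, is the part I'm unsure about.

```python
import socket
import struct

PORT = 5004              # hypothetical stream port
WANTED = "239.1.1.20"    # the one channel this client tunes to (made-up address)
OTHER = "239.1.1.21"     # some other channel in the same broadcast

def make_receiver(group, port):
    # Bind to the group address itself, then join the group (on loopback
    # here to keep the sketch self-contained; a real client would join on
    # its wireless interface). The kernel drops datagrams for any group
    # this socket didn't join.
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind((group, port))
    mreq = struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton("127.0.0.1"))
    s.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    s.settimeout(1.0)
    return s

rx = make_receiver(WANTED, PORT)

tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_IF, socket.inet_aton("127.0.0.1"))
tx.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_LOOP, 1)
tx.sendto(b"channel-20 frame", (WANTED, PORT))
tx.sendto(b"channel-21 frame", (OTHER, PORT))

data, _ = rx.recvfrom(2048)      # only the joined channel's traffic shows up
filtered = False
try:
    rx.recvfrom(2048)            # the other channel's packet never arrives
except socket.timeout:
    filtered = True
```

The point of the sketch is that "ignoring" the other channels is free at the socket level; the open question is how much work the board does per frame before that filtering kicks in.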
I'm hoping that the multicast UDP packets won't incur an ACK response from EACH client on every frame (that would eat up my air time badly), and that each client will instead silently repeat its packets to the Ethernet side. Kind of like how my Dish Network receiver doesn't tell the satellite it got the TV signal.
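As I understand it, there's nothing to acknowledge at either layer: 802.11 group-addressed (multicast/broadcast) frames aren't ACKed at the link layer, which is why they go out at a fixed basic rate, and UDP has no acknowledgements at all. The transport half is easy to see; in this sketch (group address and port made up, loopback to keep it self-contained) the send completes instantly even though nobody has joined the group:

```python
import socket

# Fire-and-forget: sendto() returns as soon as the datagram is handed to
# the stack, whether or not anyone is listening. No ACK, no retransmit.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_IF, socket.inet_aton("127.0.0.1"))

# Nobody has joined this (made-up) group; the send still "succeeds".
sent = tx.sendto(b"\x00" * 1316, ("239.1.1.99", 5004))  # 1316 = 7 MPEG-TS packets
```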
This is all on 5 GHz, probably Nstreme. And I'm sure I'm the first person to try this.