Does anybody have an idea how to use MT to simulate a higher latency, lower bandwidth connection?
I realize I’m going in the opposite direction of most use cases (making things as fast as possible vs. slowing them down), but I imagine there’s got to be a way to simulate a T1 or DSL line: both bandwidth shaping (which I can already do) and introducing synthetic latency (which I can’t yet).
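To frame the question, here’s a config sketch of the kind of thing I mean, assuming a Linux box in the traffic path with the netem and tbf qdiscs available (the interface name eth0 and the DSL-ish numbers are placeholders):

```shell
# Sketch only: shape outgoing traffic on eth0 to roughly DSL-like numbers.
# Requires root; netem and tbf are standard in modern Linux kernels.

# Add ~80 ms of one-way delay with a little jitter (the latency half).
tc qdisc add dev eth0 root handle 1: netem delay 80ms 10ms

# Chain a token-bucket filter under netem to cap bandwidth at ~1.5 Mbit/s
# (the shaping half, which I can already do by other means).
tc qdisc add dev eth0 parent 1:1 handle 10: tbf rate 1.5mbit burst 32kbit latency 400ms

# Remove the shaping when done:
# tc qdisc del dev eth0 root
```

That gets me a degraded link at the OS level, but I’m hoping there’s a way to do the latency part inside MT itself.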
Any ideas?
Thanks in advance,
-Josh