Interference Between SWOS and Real-Time Morse Code Transmission Over MikroTik Router

Hi all,

I’m currently experimenting with transmitting real-time Morse code signals (converted from text to audio tone streams) over a wireless setup managed by a MikroTik router. The setup uses SWOS for managing wireless links between multiple nodes. The Morse code tones are generated on a local web application and streamed as low-bitrate audio packets between two endpoints within the same overlay network.

The issue I’m facing is inconsistent timing and jitter in the Morse code playback on the receiving side. Since Morse code relies heavily on precise timing (dot/dash duration and spacing), even small latency variations are causing decoding errors. I’ve noticed that when SWOS is enabled for traffic management and overlay routing, packet delay variation seems higher compared to a direct bridge configuration. CPU load on the router is moderate, and bandwidth usage is minimal, so congestion doesn’t appear to be the main cause.

I’ve already tried adjusting queue types (default vs. fq_codel), disabling fasttrack for the specific traffic, and prioritizing the UDP stream via mangle rules and simple queues. However, the timing distortion persists intermittently, especially when multiple wireless clients are connected to the overlay.
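Before tuning further, it may help to measure the packet delay variation directly, independent of the Morse application, so the SwOS path and the direct-bridge path can be compared with numbers. Below is a small Python sketch of such a probe; the port number (5005), probe interval (20 ms), and function names are arbitrary choices for illustration, not part of any existing tool.

```python
# Hypothetical jitter probe: send fixed-interval UDP packets carrying a
# sequence number and send timestamp, then quantify packet delay variation
# from the receiver's arrival times. Port 5005 and the 20 ms interval are
# arbitrary choices for this sketch.
import socket
import struct
import time

INTERVAL = 0.020          # 20 ms between probe packets
FMT = "!Id"               # sequence number (uint32) + send time (double)

def send_probes(dst, port=5005, count=500):
    """Send `count` probe packets at a fixed interval."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for seq in range(count):
        sock.sendto(struct.pack(FMT, seq, time.monotonic()), (dst, port))
        time.sleep(INTERVAL)

def delay_variation(arrivals, interval=INTERVAL):
    """Mean absolute deviation of inter-arrival gaps from the nominal
    interval -- a rough stand-in for RFC 3550 style jitter."""
    gaps = [b - a for a, b in zip(arrivals, arrivals[1:])]
    return sum(abs(g - interval) for g in gaps) / len(gaps)
```

Running the sender across each configuration in turn and feeding the receiver's arrival timestamps into `delay_variation()` would show whether the SwOS-managed path really adds jitter, and how much.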

Has anyone experienced similar latency or jitter issues when running real-time, timing-sensitive applications over SWOS on MikroTik devices? Are there specific wireless settings (ACK timeout, frame aggregation, WMM, or NV2 tuning) that could help stabilize microsecond-level timing consistency? I would appreciate any recommendations on optimizing RouterOS configuration for low-latency, time-critical signal transmission.

Small nitpick: if it runs SwOS, it is a switch, not a router.
(or there are more devices involved, a router running RouterOS and a switch running SwOS?)
Maybe you could post the actual model(s) you are using and a quick sketch of the topology of your network.

What you seem to be doing may be asking more than TCP/IP can reasonably be expected to deliver.

Sophisticated stuff like internet radio, internet telephony, video etc. depends on the application to sort out timing and jitter. That is to say, the sending application encodes the timing and order of the packets and adds redundancy (for example forward error correction) to cover the possibility of missing packets. The receiving end decodes the timing and order of those packets and reconstructs missing data. The network itself gives no timing guarantees, and UDP (unlike TCP) gives no ordering guarantee either.
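The mechanism described above can be sketched in a few lines: each packet carries a sequence number and a media timestamp, and the receiver holds packets in a playout buffer long enough to reorder them and replay them on the sender's original schedule. The header layout, class names, and the 100 ms playout delay below are illustrative assumptions, not taken from any specific protocol.

```python
# Sketch of sender-side timing/order encoding and a receiver-side playout
# buffer. Each datagram = sequence number + media timestamp + payload; the
# receiver reorders by sequence number and releases each payload at
# media_ts + PLAYOUT_DELAY, restoring the sender's timing.
import heapq
import struct

HEADER = "!Id"            # sequence number (uint32) + media timestamp (s)
PLAYOUT_DELAY = 0.100     # fixed 100 ms de-jitter buffer (assumed value)

def pack(seq, media_ts, payload):
    return struct.pack(HEADER, seq, media_ts) + payload

def unpack(datagram):
    seq, ts = struct.unpack_from(HEADER, datagram)
    return seq, ts, datagram[struct.calcsize(HEADER):]

class PlayoutBuffer:
    """Reorders packets by sequence number and schedules playback."""
    def __init__(self):
        self.heap = []

    def push(self, datagram):
        seq, ts, payload = unpack(datagram)
        heapq.heappush(self.heap, (seq, ts, payload))

    def pop_due(self, now):
        """Return payloads whose scheduled play time has arrived."""
        out = []
        while self.heap and self.heap[0][1] + PLAYOUT_DELAY <= now:
            out.append(heapq.heappop(self.heap)[2])
        return out
```

The trade-off is the classic one: a larger `PLAYOUT_DELAY` absorbs more network jitter at the cost of end-to-end latency.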

Note that digital TV and radio incorporate a delay of roughly 6 seconds so that the protocols can maintain data integrity, which is probably a good guide to what "real time" looks like over TCP/IP.

If your programs do not incorporate such measures, you cannot reasonably expect timing integrity over a TCP/IP network. You should use an established protocol suite, such as the one used for telephony, or some other system that maintains integrity. Or you might need to develop a protocol of your own to maintain timing integrity, which for Morse code could run at an extremely low bit rate compared to any off-the-shelf protocol.
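To illustrate how low that bit rate could be: instead of streaming audio, the sender could transmit keying events (key up/down plus a duration in dit units) and let the receiver regenerate the tone locally, so network jitter can at worst delay a character but can never distort dot/dash timing within it. The encoding table and event format below are invented for this sketch, not a published protocol.

```python
# Illustrative low-bitrate alternative to streaming audio tones: encode
# Morse as keying events (key_down flag, duration in dit units). The
# receiver synthesizes the tone itself, so element timing is exact
# regardless of network jitter. MORSE is a tiny demo table.
MORSE = {"S": "...", "O": "---", "E": "."}

def keying_events(text):
    """Return (key_down, duration_in_dits) pairs for the given text."""
    events = []
    for i, ch in enumerate(text.upper()):
        if ch == " ":
            events.append((False, 7))        # word gap = 7 dits
            continue
        for j, el in enumerate(MORSE[ch]):
            if j:
                events.append((False, 1))    # gap between elements
            events.append((True, 1 if el == "." else 3))  # dit=1, dah=3
        if i + 1 < len(text) and text[i + 1] != " ":
            events.append((False, 3))        # gap between letters
    return events
```

Each event fits comfortably in a single byte, so a whole message is a handful of bytes rather than a continuous audio stream, and it survives jitter far better.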

There are seemingly a couple of specific protocols for Morse code transmission over TCP/IP, moip and mopp:

it has to be seen whether they are suitable for the OP's needs.

When you say wireless, do you mean wifi? If the transmission is over wifi, I would not expect it to work reliably without some mechanism to account for wifi's variable nature.