05-10-2013 06:26 AM
"The average time between transmits which is calculated per second and then averaged over the time value. As a reference at 1G with full sized frames the minimum this value could be is 20.25us (a bit is transmitted every 941 picoseconds, full size frame with SOF, header, payload, CRC, & EOF is 2148 bytes, 21480 bits when encoded, and there is a requirement of at least 6 ordered sets between frames, each ordered set is 40 bits encoded, so a minimum of 21520 bits are transmitted)" Could someone help clarify how 2148 bytes becomes 21480 bits? I saw this in the Bottleneck Detection Best Practices document.
05-14-2013 11:06 AM
2148 bytes becomes 21480 bits because of 8b/10b encoding: each 8-bit data byte is converted to a 10-bit transmission character, so 2148 × 10 = 21480 bits on the wire. This applies to pre-16G speeds (16G and later Fibre Channel use 64b/66b encoding instead). I hope this helps.
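A quick sketch of the arithmetic in the quoted text, assuming the standard 1G Fibre Channel line rate of 1.0625 Gbaud (the quoted total of 21520 bits equals 21480 frame bits plus one 40-bit ordered set):

```python
# Arithmetic behind the quoted 20.25 us figure for 1G Fibre Channel.
FRAME_BYTES = 2148            # SOF + header + payload + CRC + EOF, per the doc
ENCODED_BITS_PER_BYTE = 10    # 8b/10b: each 8-bit byte -> 10-bit character

frame_bits = FRAME_BYTES * ENCODED_BITS_PER_BYTE   # 2148 * 10 = 21480
min_bits_on_wire = frame_bits + 40                 # doc's quoted minimum: 21520

LINE_RATE_BAUD = 1.0625e9                          # 1G FC encoded line rate
bit_time_ps = 1e12 / LINE_RATE_BAUD                # ~941 ps per encoded bit
min_gap_us = min_bits_on_wire / LINE_RATE_BAUD * 1e6  # ~20.25 us

print(frame_bits)             # 21480
print(round(bit_time_ps))     # 941
print(round(min_gap_us, 2))   # 20.25
```

This reproduces all three numbers from the quoted passage: the 21480 encoded frame bits, the ~941 ps bit time, and the 20.25 µs minimum inter-frame interval.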