Apple - Should I use a Thunderbolt adapter or a USB 3.0 adapter for ethernet?

SHORT ANSWER

The answer to the question as asked in the title is that it really depends on what ports you have available, personal preference, cost, and so on.

I say this because both USB 3.0 and Thunderbolt are faster than ethernet, so it doesn't matter which way you go from a speed point of view.

More specifically (and at the risk of oversimplifying it):

  • Ethernet supports up to 1Gbps*
  • USB 3.0 supports up to 5Gbps
  • USB 3.1 supports up to 10Gbps
  • Thunderbolt 1 up to 10Gbps
  • Thunderbolt 2 up to 20Gbps
  • Thunderbolt 3 up to 40Gbps

*In the overwhelming majority of cases, although 10Gbps ethernet networks do exist.

So, as you can see, it doesn't matter because they're all faster than the ethernet you're converting to.
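If you want to sanity-check those numbers yourself, here's a quick back-of-envelope sketch in Python. The figures are theoretical maximums only - real-world throughput is always lower due to protocol and encoding overhead - but the point stands: every one of these links outruns Gigabit Ethernet.

```python
# Time to move a 10 GB file at each link's theoretical maximum.
# Illustrative only: real throughput is reduced by protocol overhead.
FILE_BITS = 10 * 8 * 10**9  # 10 GB expressed in bits

links_gbps = {
    "Gigabit Ethernet": 1,
    "USB 3.0": 5,
    "USB 3.1": 10,
    "Thunderbolt 1": 10,
    "Thunderbolt 2": 20,
    "Thunderbolt 3": 40,
}

for name, gbps in links_gbps.items():
    seconds = FILE_BITS / (gbps * 10**9)
    print(f"{name:18s} {seconds:6.1f} s")
```

Whichever adapter you pick, the Gigabit Ethernet segment is the slowest link in the chain, so it sets the ceiling.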

LONG ANSWER

I'm adding this longer answer due to the various comments below my original answer.

For starters, ethernet can actually support up to 10Gbps. However, in 99.9% of cases (okay, I can't cite a source for that figure - I'm just making a point) this will not be a practical consideration for users unless they intend to connect to an existing 10Gbps ethernet network. While 10Gbps ethernet is starting to gain some traction in terms of network infrastructure, this is only happening in some of the largest organisations or those with a particular need for this type of setup (such as ISPs, cloud providers, data centres, etc). It is also worth noting that Apple has never launched a computer (not even a Mac Pro or Server) that natively supports 10Gbps ethernet.

One of the reasons for the slow take-up rate of 10Gbps ethernet is that it requires full duplex point-to-point links (typically via network switches); as a result, half duplex operation and repeater hubs do not work in 10Gbps ethernet networks. So converting an existing ethernet network to 10Gbps is no trivial matter and is quite expensive. All that said, I expect the deployment of 10Gbps ethernet networks to really start taking off more broadly due to the demands of HD video editing and the requirement of more organisations to have high-performance shared storage systems.

But for typical consumers, this is not something worth considering when adding an ethernet port to a computer and deciding on the type of adapter to buy.

A word about latency

A lot has been made about latency in the comments. While latency is a factor - especially when large networks with many network devices are involved - it's less of an issue for typical consumers.

Does latency matter to typical consumers?

Yes and no. A user on a home network who needs to transfer some photos and documents from a MacBook to an iMac is not going to be too concerned if it takes a couple of seconds for the transfer to commence. On the other hand, if the same user is browsing the web and it takes a couple of seconds for a page to start loading, that can be enough for them to move on to something else. So, latency can be very important to the overall user experience, but how important it is also depends on the application. If we spend hours on the internet we want our pages to load quickly, and latency can definitely affect this (just talk to any Satellite internet user). On the other hand, if we only transfer files across a home network occasionally, it's less important.
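To put some rough numbers on that intuition, here's a deliberately simplified model in Python (illustrative figures, not measurements): total transfer time is approximated as one round trip of setup latency plus the payload at line rate. For a small web page the latency dominates; for a bulk file transfer it's a rounding error.

```python
def transfer_time(size_bytes, rtt_s, throughput_bps):
    """Total time ~= one round trip of latency plus the payload
    at line rate - a deliberately simplified model."""
    return rtt_s + (size_bytes * 8) / throughput_bps

# A satellite-like 600 ms round trip on a 100 Mbps link (illustrative numbers):
page_s = transfer_time(100_000, 0.6, 100e6)  # small web page
file_s = transfer_time(4e9, 0.6, 100e6)      # 4 GB bulk transfer

print(f"page: {page_s:.3f} s (latency is {0.6 / page_s:.0%} of the total)")
print(f"file: {file_s:.1f} s (latency is {0.6 / file_s:.2%} of the total)")
```

In this sketch the round trip accounts for nearly all of the page-load time but a tiny fraction of the file transfer, which is why latency matters so much more for interactive browsing than for occasional bulk copies.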

So, what is latency?

At the risk of oversimplifying things, latency refers to the delay in transmission time that occurs while data remains in a device's buffered memory (e.g. bridge, router, etc) before it can be sent along its path. While it seems to only be hardware related, latency is in fact affected by both hardware and software factors. Some are listed below:

Hardware factors

  • Traversing the network medium
  • Traversing network switches and devices
  • Transmission through the PCIe bus
  • Memory access times
  • Length of network cables
  • Etc etc

Software factors

  • Firmware running on the adapter
  • The device driver controlling the adapter
  • Operating system execution
  • The portion of the network stack that the data has to traverse
  • Etc etc

Regardless of the factor involved, the impact of latency on effective network throughput can be temporary (e.g. a momentarily congested buffer) or persistent (e.g. a slow driver or an underpowered switch).

How is latency measured?

In terms of ethernet networks, latency can be measured with different tools and methods, such as those specified by IETF RFC 2544, netperf, or Ping-Pong (no, not the table tennis game). Put very simply, the main difference between these various methods is the point at which latency is measured. Regardless though, while excessive latency can limit the performance of network applications by delaying data arrival, this delay is less likely to be noticeable in a typical consumer network because there aren't usually many network devices involved. That is, because there are fewer adapters, bridges, routers, etc between the source and destination, the total latency should be lower. While users can run pings and traceroutes to measure this delay, in real-world home applications (e.g. transferring files) it's not going to be noticeable unless there is a problem somewhere.
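As a rough illustration of the Ping-Pong approach, the following Python sketch times the round trip of a single byte against a local echo server. Note this measures loopback latency only, not a real network path - real tools like ping and netperf do the same thing across the wire and with far more rigour.

```python
import socket
import threading
import time

def echo_server(sock):
    # Accept one connection and echo every byte back.
    conn, _ = sock.accept()
    with conn:
        while data := conn.recv(1):
            conn.sendall(data)

server = socket.socket()
server.bind(("127.0.0.1", 0))  # let the OS pick a free port
server.listen(1)
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

client = socket.create_connection(server.getsockname())
client.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)  # no batching

# "Ping-pong": send one byte, wait for it back, time the round trip.
rtts = []
for _ in range(100):
    start = time.perf_counter()
    client.sendall(b"x")
    client.recv(1)
    rtts.append(time.perf_counter() - start)
client.close()

print(f"min {min(rtts) * 1e6:.0f} us, median {sorted(rtts)[50] * 1e6:.0f} us")
```

On loopback these round trips are typically tens of microseconds; across a home LAN through an adapter and a switch they grow to hundreds of microseconds or a few milliseconds, which is still far below anything a user would notice when copying files.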

So, is latency a factor in determining the type of ethernet adapter one purchases?

Yes and no. In a sense this is irrelevant in a small/home network because there just aren't going to be many network devices. But if you have to make a decision on which type of adapter to purchase for 6 computers per room in a building of 10 rooms where all the computers are on the same ethernet network connected by multiple bridges etc, then it is much more relevant.

Thunderbolt v USB 3 re latency

So, which type of adapter is better in terms of latency? Generally, a Thunderbolt to Ethernet adapter is likely to have lower latency than a USB 3 to Ethernet adapter. But, as manufacturers focus on bandwidth or throughput when they publish specs, you won't find it easy to quantify this or to compare adapters.

So, why would I prefer a Thunderbolt to Ethernet adapter? To be honest, in a small/home network I probably wouldn't, as I think the difference would be negligible and unnoticeable to the naked eye (so to speak). For me, the choice would come down to what ports I have available (or am willing to sacrifice) and the cost. But if it were a large network, my preference for Thunderbolt would be based on the real-world experiences of users in particular fields.

For example, in the music production industry users have found that with audio devices capable of being connected through either Thunderbolt or USB 3, the overall audio latency of the connection is about 1ms for Thunderbolt and 4.5ms for USB 3. Now, these figures can be affected by other factors, but since these setups involve the exact same equipment, it appears that for whatever reason the Thunderbolt connection is faster (probably because Thunderbolt carries PCIe and so gets almost direct access to the CPU).

Whether this difference would be replicated in terms of a typical ethernet network is unclear. By that I mean connecting a PC to specialised audio equipment directly via Thunderbolt is different to connecting a PC to an ethernet network via a Thunderbolt or USB 3 adapter. Even if it was replicated, while audio latency may be noticeable to music professionals, the transfer of files and documents is different again.


I would recommend Thunderbolt, as it is essentially external PCI-Express, which is the same bus an internal network card (among other things like graphics cards, etc) is attached to.

PCI-E (and thus Thunderbolt) supports DMA, which allows the network card to write packets to the system's memory directly without involving the CPU. USB devices, as far as I know, cannot initiate DMA themselves - the USB host controller moves the data on their behalf, which requires more cooperation from the CPU for every network packet.


An answer from my personal experience: I've used both

  • original Apple's Thunderbolt to Gigabit Ethernet adapter
  • Cable Matters DB50 USB 3.0 to Gigabit Ethernet adapter

and noticed no difference, either when testing for speed or in daily use.