What's the difference between UTF8/UTF16 and Base64 in terms of encoding

UTF-8 and UTF-16 are methods to encode Unicode strings to byte sequences.

See: The Absolute Minimum Every Software Developer Absolutely, Positively Must Know About Unicode and Character Sets (No Excuses!)

Base64 is a method to encode a byte sequence to a string.
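
A short Python 3 sketch (standard library only; the variable names are just for illustration) shows the two directions side by side:

import base64

text = 'héllo'                       # a Unicode string
utf8_bytes = text.encode('utf-8')    # text -> bytes: b'h\xc3\xa9llo'
utf16_bytes = text.encode('utf-16')  # text -> bytes, with a byte-order mark prefix

raw = b'\x00\xff\x10'                              # arbitrary binary data, not text
printable = base64.b64encode(raw).decode('ascii')  # bytes -> printable text: 'AP8Q'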

So, these are very different concepts and should not be confused.

Things to keep in mind:

  • Not every byte sequence represents a Unicode string encoded in UTF-8 or UTF-16.

  • Not every Unicode string represents a byte sequence encoded in Base64. (Both failure modes are shown in the sketch after this list.)
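
A minimal Python 3 sketch (standard library only) demonstrating both points:

import base64
import binascii

# Not every byte sequence is valid UTF-8:
try:
    b'\xff\xfe\xfd'.decode('utf-8')
except UnicodeDecodeError as exc:
    print('not UTF-8:', exc)

# Not every Unicode string is valid Base64:
try:
    base64.b64decode('not base64!?', validate=True)
except binascii.Error as exc:
    print('not Base64:', exc)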


Base64 is a way to encode binary data, while UTF-8 and UTF-16 are ways to encode Unicode text. Note that in a language like Python 2.x, where binary data and strings are mixed, you can encode strings into Base64 or UTF-8 the same way:

u'abc'.encode('utf16')   # -> '\xff\xfea\x00b\x00c\x00' (BOM plus UTF-16-LE bytes)
u'abc'.encode('base64')  # -> 'YWJj\n' (Python 2 implicitly ASCII-encodes the unicode first)

But in languages that draw a clearer line between the two kinds of data, the two representations generally serve quite different purposes, which keeps the concerns separate.
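
Python 3 is an example of such a language: str and bytes are distinct types, so each direction only type-checks one way. A minimal sketch (standard library only):

import base64

'abc'.encode('utf-16')       # str -> bytes: fine
base64.b64encode(b'abc')     # bytes -> bytes (ASCII subset): fine
# 'abc'.encode('base64')     # LookupError: 'base64' is not a text encoding in Python 3
# base64.b64encode('abc')    # TypeError: a bytes-like object is required, not 'str'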


UTF-8 is, like the other UTF encodings, a character encoding for the characters of the Unicode character set (UCS).
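
For instance, UTF-8 maps each code point to one to four bytes. A Python 3 sketch (the sample characters are arbitrary):

for ch in ('A', 'é', '€', '𝄞'):
    print('U+%04X -> %r' % (ord(ch), ch.encode('utf-8')))
# U+0041 -> b'A'                  (1 byte)
# U+00E9 -> b'\xc3\xa9'           (2 bytes)
# U+20AC -> b'\xe2\x82\xac'       (3 bytes)
# U+1D11E -> b'\xf0\x9d\x84\x9e'  (4 bytes)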

Base64 is an encoding to represent any byte sequence by a sequence of printable characters (i.e. A-Z, a-z, 0-9, +, and /).
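
The mechanics: Base64 reads the input three bytes (24 bits) at a time and re-groups them into four 6-bit values, each indexing into that 64-character alphabet. A minimal Python sketch (padding for inputs not divisible by 3 is omitted for brevity):

ALPHABET = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/'

def encode3(b):
    # pack three bytes into a 24-bit integer, then peel off four 6-bit indices
    n = (b[0] << 16) | (b[1] << 8) | b[2]
    return ''.join(ALPHABET[(n >> shift) & 0x3F] for shift in (18, 12, 6, 0))

print(encode3(b'Man'))  # 'TWFu' -- matches base64.b64encode(b'Man')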

There is no System.Text.Encoding.Base64 because Base64 is not a text encoding but rather a base conversion, like hexadecimal, which uses 0-9 and A-F (or a-f) to represent numbers.
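
The parallel with hex is easy to see in code; both turn the same bytes into printable text, just with different bases (a Python 3 sketch using the standard library):

import base64
import binascii

data = b'\x01\x9a\xff'
print(binascii.hexlify(data))   # b'019aff' -- base 16, two characters per byte
print(base64.b64encode(data))   # b'AZr/'   -- base 64, four characters per three bytes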