Output ("echo") a variable to a text file

The simplest Hello World example...

$hello = "Hello World"
$hello | Out-File c:\debug.txt
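
If you need a specific encoding, note that Windows PowerShell's Out-File defaults to UTF-16LE ("Unicode"); a minimal variation that requests UTF-8 instead (invariably with a BOM, in Windows PowerShell):

$hello = "Hello World"
$hello | Out-File c:\debug.txt -Encoding utf8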

Note: The answer below is written from the perspective of Windows PowerShell.
However, it applies to the cross-platform PowerShell (Core) v6+ as well, except that the latter - commendably - consistently defaults to BOM-less UTF-8 as the character encoding, which is the most widely compatible one across platforms and cultures.


To complement bigtv's helpful answer with a more concise alternative and background information:

# > $file is effectively the same as | Out-File $file
# Objects are written the same way they display in the console.
# Default character encoding is UTF-16LE (mostly 2 bytes per char.), with BOM.
# Use Out-File -Encoding <name> to change the encoding.
$env:computername > $file

# Set-Content calls .ToString() on each object to output.
# Default character encoding is "ANSI" (culture-specific, single-byte).
# Use Set-Content -Encoding <name> to change the encoding.
# Use Set-Content rather than Add-Content; the latter is for *appending* to a file.
$env:computername | Set-Content $file 
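
To see the differing default encodings in action, you can inspect each file's first bytes; a quick sketch for Windows PowerShell (PowerShell (Core) v6+ would use -AsByteStream instead of -Encoding Byte):

$env:computername > $file
Get-Content $file -Encoding Byte -TotalCount 2   # 255 254 -> the UTF-16LE BOM
$env:computername | Set-Content $file
Get-Content $file -Encoding Byte -TotalCount 2   # raw ANSI bytes of the first 2 chars., no BOM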

When outputting to a text file, you have two fundamental choices that use different object representations and, in Windows PowerShell (as opposed to PowerShell Core), also employ different default character encodings:

  • Out-File (or >) / Out-File -Append (or >>):

    • Suitable for output objects of any type, because PowerShell's default output formatting is applied to the output objects.

      • In other words: you get the same output as when printing to the console.
    • The default encoding, which can be changed with the -Encoding parameter, is Unicode, i.e. UTF-16LE, in which most characters are encoded as 2 bytes. The advantage of a Unicode encoding such as UTF-16LE is that it is a global alphabet, capable of encoding all characters from all human languages.

      • In PSv5.1+, you can change the encoding used by > and >>, via the $PSDefaultParameterValues preference variable, taking advantage of the fact that > and >> are now effectively aliases of Out-File and Out-File -Append. To change to UTF-8 (invariably with a BOM, in Windows PowerShell), for instance, use:
        $PSDefaultParameterValues['Out-File:Encoding']='UTF8'
  • Set-Content / Add-Content:

    • For writing strings and instances of types known to have meaningful string representations, such as the .NET primitive data types (Booleans, integers, ...).

      • The .psobject.ToString() method is called on each output object, which results in meaningless representations for types that don't explicitly implement a meaningful string representation; [hashtable] instances are an example (see the sketch after this list):
        @{ one = 1 } | Set-Content t.txt writes the literal string System.Collections.Hashtable to t.txt, which is the result of @{ one = 1 }.ToString().
    • The default encoding, which can be changed with the -Encoding parameter, is Default, which is the system's active ANSI code page, i.e. the single-byte culture-specific legacy encoding for non-Unicode applications, which is most commonly Windows-1252.
      Note that the documentation currently incorrectly claims that ASCII is the default encoding.

    • Note that Add-Content's purpose is to append content to an existing file, and it is only equivalent to Set-Content if the target file doesn't exist yet.
      If the file exists and is nonempty, Add-Content tries to match the existing encoding.
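
The formatting difference between the two approaches is easy to demonstrate; a small sketch ([hashtable] input, t1.txt / t2.txt as placeholder paths):

@{ one = 1 } | Out-File t1.txt      # t1.txt contains the friendly table view (Name / Value columns)
@{ one = 1 } | Set-Content t2.txt   # t2.txt contains the literal string "System.Collections.Hashtable"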

Out-File / > / Set-Content / Add-Content all act culture-sensitively, i.e., they produce representations suitable for the current culture (locale), if available (though custom formatting data is free to define its own, culture-invariant representation - see Get-Help about_format.ps1xml). This contrasts with PowerShell's string expansion (string interpolation in double-quoted strings), which is culture-invariant - see this answer of mine.
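
A sketch that contrasts the two behaviors (assumes .NET 4.6+, so that [cultureinfo]::CurrentCulture is assignable, and that the code runs from a script, because the interactive console resets the culture after every command line):

[cultureinfo]::CurrentCulture = 'de-DE'   # hypothetical test culture; "," is its decimal separator
1.2 | Out-File t.txt
Get-Content t.txt   # -> "1,2": Out-File formats culture-sensitively
"$(1.2)"            # -> "1.2": string expansion is culture-invariant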

As for performance: Since Set-Content doesn't have to apply default formatting to its input, it performs better than Out-File.
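
A rough, non-scientific way to quantify the difference (file paths are placeholders):

$lines = 1..100000 | ForEach-Object { "line $_" }
(Measure-Command { $lines | Out-File t1.txt }).TotalMilliseconds
(Measure-Command { $lines | Set-Content t2.txt }).TotalMilliseconds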


As for the OP's symptom with Add-Content:

Since $env:COMPUTERNAME cannot contain non-ASCII characters (or verbatim ? characters), Add-Content's addition to the file should not result in ? characters, and the likeliest explanation is that the ? instances were part of the preexisting content in output file $file, which Add-Content appended to.
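
An easy way to test this theory (a sketch; it assumes starting with no preexisting file):

Remove-Item $file -ErrorAction Ignore
$env:COMPUTERNAME | Add-Content $file
Get-Content $file   # should show the computer name, with no "?" characters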


After some trial and error, I found that

$computername = $env:computername

works to get a computer name, but sending $computername to a file via Add-Content doesn't work.

I also tried $computername.Value.

Instead, if I use

$computername = get-content env:computername

I can send it to a text file using

$computername | Out-File $file

Your sample code seems to be OK, so the root cause needs to be tracked down some other way. Let's rule out typos in the script first: make sure you put Set-StrictMode -Version 2.0 at the beginning of your script. This will help you catch misspelled variable names. Like so,

# Test.ps1
set-strictmode -version 2.0 # Comment this line and no error will be reported.
$foo = "bar"
set-content -path ./test.txt -value $fo # Error! Should be "$foo"

PS C:\temp> .\test.ps1
The variable '$fo' cannot be retrieved because it has not been set.
At C:\temp\test.ps1:3 char:40
+ set-content -path ./test.txt -value $fo <<<<
    + CategoryInfo          : InvalidOperation: (fo:Token) [], RuntimeException
    + FullyQualifiedErrorId : VariableIsUndefined

The next part, about the question marks, sounds like you have a problem with Unicode. What's the output when you display the file with PowerShell, like so,

$file = "\\server\share\file.txt"
cat $file
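
If the contents display as question marks or otherwise look garbled, inspecting the raw bytes can reveal an encoding mismatch or a BOM; in PSv5+, for instance:

Format-Hex $file | Select-Object -First 4   # first 64 bytes of the file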