Amazon Athena - Column cannot be resolved on basic SQL WHERE query

I noticed that the CSV source of the original table had column headers with capital letters (X and Y), unlike the column names that were being displayed in Athena. So I dropped the table, edited the CSV file so that the headers were lowercase (x and y), then recreated the table, and now it works!
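If you have many files, lowercasing the header row by hand gets tedious. A minimal Python sketch (file paths here are placeholders, not from the original post):

```python
import csv

def lowercase_headers(src_path, dst_path):
    """Rewrite a CSV so its header row is lowercase, matching the
    column names Athena displays. Data rows are copied unchanged."""
    with open(src_path, newline="") as src, \
         open(dst_path, "w", newline="") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        header = next(reader)                      # first row = column names
        writer.writerow([h.lower() for h in header])
        writer.writerows(reader)                   # copy remaining rows as-is
```

After rewriting the files, drop and recreate the table so the crawler picks up the lowercase names.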


In my case, changing double quotes to single quotes resolves this error.

Presto uses single quotes for string literals and double quotes for identifiers.

https://trino.io/docs/current/migration/from-hive.html#use-ansi-sql-syntax-for-identifiers-and-strings

Strings are delimited with single quotes and identifiers are quoted with double quotes, not backquotes:

SELECT name AS "User Name"
FROM "7day_active"
WHERE name = 'foo'

I have edited my response to this issue based on my current findings and my contact with both the AWS Glue and Athena support teams.

We were having the same issue - an inability to query the first column in our CSV files. The problem comes down to the encoding of the CSV file. In short, AWS Glue and Athena currently do not support CSVs encoded as UTF-8 with a Byte Order Mark (UTF-8-BOM). If you open a CSV encoded with a Byte Order Mark (BOM) in Excel or Notepad++, it looks like any comma-delimited text file. Opening it in a hex editor, however, reveals the underlying issue: there are extra bytes at the start of the file, i.e. the BOM.

When a UTF-8-BOM CSV file is processed in AWS Glue, it retains these special characters and associates them with the first column name. When you try to query the first column within Athena, you will get an error.
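You can reproduce the corruption locally in Python: a file written with a BOM (codec `utf-8-sig`) keeps the BOM bytes when decoded as plain UTF-8, and the invisible character ends up glued to the first header name, so it no longer matches the column you query:

```python
import csv
import io

# Simulate a CSV saved with a UTF-8 BOM (e.g. Excel's "CSV UTF-8" format).
data = "x,y\n1,2\n".encode("utf-8-sig")   # prepends b'\xef\xbb\xbf'

# Decode it as plain UTF-8, the way a BOM-unaware parser would.
header = next(csv.reader(io.StringIO(data.decode("utf-8"))))
print(repr(header[0]))    # '\ufeffx' — the BOM is stuck to the first column
print(header[0] == "x")   # False: a query on column "x" cannot match this name

# Decoding with 'utf-8-sig' strips the BOM, recovering the clean header.
clean = next(csv.reader(io.StringIO(data.decode("utf-8-sig"))))
print(clean[0] == "x")    # True
```

This is the same mismatch Glue bakes into the table schema: the stored first column name is `\ufeffx`, not `x`.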

There are ways around this on AWS:

  • In AWS Glue, edit the table schema and delete the first column, then reinsert it back with the proper column name, OR

  • In AWS Athena, execute the SHOW CREATE TABLE DDL to script out the problematic table, remove the special character in the generated script, then run the script to create a new table which you can query on.

To make your life simple, just make sure your CSVs are encoded as plain UTF-8, without a BOM.
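One way to guarantee that before uploading to S3 is to re-encode each file; Python's `utf-8-sig` codec decodes files with or without a BOM, so this sketch (paths are placeholders) is safe to run even on already-clean files:

```python
def strip_bom(src_path, dst_path):
    """Re-encode a text file as plain UTF-8, dropping any leading BOM.
    'utf-8-sig' accepts input both with and without a BOM, so running
    this on an already-clean file is a no-op copy."""
    with open(src_path, encoding="utf-8-sig") as src, \
         open(dst_path, "w", encoding="utf-8") as dst:
        dst.write(src.read())
```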