I have checked that this issue has not already been reported.
I have confirmed this bug exists on the latest version of Polars.
Reproducible example
import polars as pl

df = pl.read_excel("data.xlsx")
df
Using data.xlsx, where the header row contains dates and the remaining rows of columns 1, 2 and 3 contain integers, Polars does not properly handle the difference between their dtypes; see the log output below for how df is displayed.
Can someone help me with this issue?
The dtypes are correct, as they reflect the data, which is integer (not date). The default "calamine" engine (not unreasonably) expects column headers to be strings, not dates/floats/int/other (as that looks like data, not a header).
If you ensure that you convert your date-headers to text, everything will work as expected 👌 Alternatively you can use the xlsx2csv engine, as suggested by @cmdlineluser, but you'll lose a lot of speed in doing so.
Likely worth asking the calamine folks to consider this case, perhaps providing a string-coercion method when reading headers (which you could then pass when using the Polars method).
Log output
Issue description
The issue occurs when the header dtype differs from the dtype of the remaining rows.
Expected behavior
shape: (3, 4)
┌───────┬────────────┬────────────┬────────────┐
│ Name  ┆ 2024-01-01 ┆ 2024-02-01 ┆ 2024-03-01 │
│ ---   ┆ ---        ┆ ---        ┆ ---        │
│ str   ┆ Date       ┆ Date       ┆ Date       │
╞═══════╪════════════╪════════════╪════════════╡
│ John  ┆ 1          ┆ 2          ┆ 3          │
│ Mary  ┆ 2          ┆ 4          ┆ 6          │
│ Polly ┆ 3          ┆ 6          ┆ 9          │
└───────┴────────────┴────────────┴────────────┘
Installed versions