Handle 'JUNK' chunks when reading in wave files #17
Comments
Hi @ryjose1, that's a good catch, I had no idea!
Do you mind trying against the latest master? There should be a fix in place!
Sorry for the delay, life got a little busy. It looks like this file is weird because it also has 24 bits per sample; I'm no longer crashing when reading in the file, but I'm erroring out when writing it back to disk for testing. For context, I'm taking the file and putting it through a simple read-then-write flow.
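Roughly this, as a minimal sketch (the file names are placeholders, and it assumes the wave package's ReadWaveFile and WriteFrames helpers):

```go
package main

import (
	"log"

	wav "github.com/DylanMeeus/GoAudio/wave"
)

func main() {
	// Reading the problematic file no longer crashes on latest master...
	w, err := wav.ReadWaveFile("input.wav") // placeholder path
	if err != nil {
		log.Fatalf("read: %v", err)
	}
	// ...but writing the frames straight back out errors, presumably
	// because of the 24 bits per sample.
	if err := wav.WriteFrames(w.Frames, w.WaveFmt, "output.wav"); err != nil {
		log.Fatalf("write: %v", err)
	}
}
```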
No worries about the delay, thanks for testing! I will look into writing the 24-bit files soonish.
So this file is pretty odd: the "fmt " chunk is larger than the expected size, but it doesn't report that. By default the fmt chunk should be 16 bytes wide, unless it reports more. The field that is supposed to report the extra data is set to 0, yet the chunk does contain more than 16 bytes. A very strange file indeed; I'll have to look a bit more into how I can handle such cases gracefully.
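For the well-behaved variant of this, one graceful option might be to trust the declared chunk size and discard anything past the 16 bytes the parser actually understands. A sketch, with skipFmtExtension as a hypothetical helper rather than anything in the current reader:

```go
package wave

import "io"

// skipFmtExtension discards any fmt-chunk bytes beyond the 16 the
// parser understands (e.g. cbSize and extension fields), so that the
// next chunk header lines up. Hypothetical helper, not part of GoAudio.
func skipFmtExtension(r io.Reader, declaredSize int) error {
	const baseFmtSize = 16
	if declaredSize <= baseFmtSize {
		return nil
	}
	_, err := io.CopyN(io.Discard, r, int64(declaredSize-baseFmtSize))
	return err
}
```

That wouldn't help with this particular file, though, since here the declared size itself is wrong; for that case the reader would probably have to scan forward for the next known chunk ID instead.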
For some context, this file isn't a blocker for me; it's just my luck that the arbitrary test file I picked has so many quirks. Thanks for your help so far and going forward!
I was trying to use this module to read in this wave file and was having some trouble.
It looks like it's possible for a chunk beginning with "JUNK" to appear before the "fmt" chunk, and it should be skipped.
I believe it's described on this page about the "RIFF" format: https://www.daubnet.com/en/file-format-riff
The fix would go around here: https://github.com/DylanMeeus/GoAudio/blob/master/wave/reader.go#L60
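A rough sketch of the idea (seekFmtChunk is a hypothetical name, not the reader's actual structure):

```go
package wave

import (
	"encoding/binary"
	"io"
)

// seekFmtChunk walks chunk headers until it finds "fmt ", skipping
// anything else ("JUNK", "bext", ...) along the way. It assumes r is
// positioned just past the RIFF/WAVE header.
func seekFmtChunk(r io.Reader) (uint32, error) {
	for {
		var hdr [8]byte
		if _, err := io.ReadFull(r, hdr[:]); err != nil {
			return 0, err
		}
		size := binary.LittleEndian.Uint32(hdr[4:8])
		if string(hdr[:4]) == "fmt " {
			return size, nil // caller parses the fmt body next
		}
		// Skip this chunk's payload; per the RIFF spec, odd-sized
		// chunks are padded to an even boundary.
		skip := int64(size)
		if size%2 != 0 {
			skip++
		}
		if _, err := io.CopyN(io.Discard, r, skip); err != nil {
			return 0, err
		}
	}
}
```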
Tangentially, there was a similar issue in the TensorFlow project: tensorflow/tensorflow#26247