Sending video meta data through direct video #808
Comments
Direct video mode is a very non-standard use of HDMI. Not sure if such data is possible.
Thank you for the explanation. I'll leave this ticket open so that other devs can chime in. About RetroTink, I believe that Mike Chi can elaborate further :)
I already talked to Mike Chi about this a week ago as well. It shouldn't be a major problem, as the community is allowed to develop presets and share them with each other for the RetroTink 4K. The solution is to use these presets when they become available. It should only impact a handful of cores, such as the SNES core. The overlap of people who can afford a RetroTink 4K and want to use the MiSTer with it will be very small anyway.
The problem is that users will then have to manually switch profiles whenever they switch cores on MiSTer. Adding metadata would make it work automatically. I for one would be very happy if this was added!
Yeah, agreed.
Another vote here for adding metadata.
This would be a great feature. I'll be connecting my MiSTer to my RetroTink 4K (once it arrives).
Should totally do it. 100% agree. Wonder why it was never done when Mike Chi asked over 6 months ago and decided to support the next platform of FPGA gaming (MARS).
We're seeing a pretty constant flow of RT4K owners complaining about bad results with MiSTer direct video due to the missing metadata; this would make things go a lot smoother for MiSTer users who want to do further processing of the digital signal. This problem will only get worse once the Morph 4K and OSSC Pro both hit general availability.
There is an overlap between RetroTink 4K users and MiSTer users, as can be seen from the feedback on Discord and social media. Both are solutions for people who want a great experience from classic games and are willing to spend a bit extra for it. External scalers like the RT4K or OSSC Pro can improve MiSTer's video output by providing better deinterlacing and higher-resolution output. Metadata would make the experience a lot smoother.
For the record, Mike Chi didn't ask any MiSTer developers that I'm aware of; Toya asked on his behalf. I reached out to Mike Chi directly and (after some back and forth understanding what he needed) he said, "I just want to be able to say I brought it up", so it didn't seem like a very high priority. He told me that community-made profiles could be shared and this could address it. This was 4 months ago, not 6 months ago. Despite the rude thumbs-downs from people randomly necro'ing this issue months later, I did bring this up in an internal development channel in Discord, I listened to Mike Chi, and I passed on the message in a proactive way after trying to gather information and help. I also let Mike Chi know that he should contact sorgelig on Facebook directly and then submit a PR to get it added, but he wasn't interested.
Part of the issue, as I have recently learned, is that the direct video mode for HDMI is not sending a "raw", untouched signal. Apparently pixel repetition and super resolution are being applied. Sounds like we need another option for external scalers to receive truly raw data via digital HDMI.
This was by design so it can work with cheap AG620x DACs; read above. Direct video mode wasn't designed to work with a $750 scaler that was released 4 years later.
Fair enough, so this is a feature request. And I would think it would be helpful for all scalers, not just the RT4K.
What other scalers would this help out with that you know of?
The OSSC Pro and PixelFX Morph 4K immediately come to mind, but any scaler with HDMI input, past/present/future, could potentially benefit. The reason why this is coming up now (the "necro") is because the three scalers mentioned are beginning to ship to actual customers, who are turning around and complaining to the scaler developers when they get unexpected results with MiSTer direct video mode.
Why are you guys still obsessed with AG6200 junk? These DACs are atrocious.
So far the direct video works great. There are really only two issues:
The only metadata that's needed is the de-repetition factor and how to adjust the crops from what is presented by the HDMI DE signal, probably less than 10 bytes of data. Maybe packing the data into an HDMI infoframe would work. The actual HDMI video itself does not need to be changed at all. If one wanted to get extra fancy, the crops could even contain data on how to overcome the console's overscan/padding (for example, cutting out the unused space on the Master System), but this might be more work than is desirable. This would allow the scaler to have a 1:1 replica of the console's frame buffer and be a much simpler, automated user experience.
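For a sense of scale, the handful of bytes described above would fit easily in an HDMI infoframe. All HDMI infoframes carry a checksum byte chosen so that the header, checksum, and payload sum to zero modulo 256. A minimal Python sketch follows; the 5-byte field layout and the `pack_dv_metadata` helper are hypothetical illustrations of the idea, not the format MiSTer actually implements:

```python
def infoframe_checksum(header: bytes, payload: bytes) -> int:
    """HDMI infoframe checksum: all bytes (header + checksum + payload)
    must sum to 0 modulo 256."""
    return (256 - (sum(header) + sum(payload)) % 256) % 256

def pack_dv_metadata(pixel_rep: int, crop_h: int, crop_v: int) -> bytes:
    """Hypothetical 5-byte payload: de-repetition factor plus
    horizontal/vertical crop adjustments (signed, little-endian).
    Header mimics an SPD-style infoframe (type 0x83, version, length),
    though a real implementation might pick a different carrier."""
    payload = bytes([pixel_rep]) \
        + crop_h.to_bytes(2, "little", signed=True) \
        + crop_v.to_bytes(2, "little", signed=True)
    header = bytes([0x83, 0x01, len(payload)])
    return header + bytes([infoframe_checksum(header, payload)]) + payload
```

Whatever the actual layout, the point stands that the cost on the wire is under 10 bytes per frame.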
It would just be a few bytes of data, but calculating it and implementing that for every core is an unknown amount of work. Cores don't typically provide any metadata; they just generate a video signal, so it would need to be inferred in the sys framework, passed back to main, and then placed into an infoframe. I implemented core name metadata and gave a version to various scaler developers to try out. It's not ideal, but it would allow for some sane presets. It would be possible to remove the extra blanks when the output mode is RGB, since they are there to better support YPbPr. However, when I did that, the first line was flickering in some resolutions. It's hard to know whether that is an issue with MiSTer code, the DE-10's HDMI IC, or the HDMI receiver chip. Without some kind of HDMI analysis device it's just guesswork.
The only viable option possible is output with real blanking signals.
I ASKED MIKE CHI TO LEAVE A COMMENT. IT WAS ME, I TALKED TO MIKE CHI.
We are talking about raw digital video out the HDMI port, so no DAC should be involved as far as I am aware.
Not sure about de-repetition. Need to investigate whether it would increase the complexity of the code too much. Also I'm not sure if the ADV7513 allows outputting a lower clock. If it does, then probably without audio. Anyway, I don't have the equipment to test such video output. The AG6200 won't work with a low pixel clock.
To claim that, you have to say which competitors you are comparing against. So far the AG6200 is the only chip supporting any resolution and providing great analog output with 256 levels per channel.
https://docs.google.com/spreadsheets/d/1nbepvFFBVsLrs1myOiVWqMVLp9-oB9TataRmVlcyqlA/
And? Where is the test among all the cores? SNES produces more or less ideal video, so it's not a good test bench. Many cores have non-standard resolutions and refresh rates where many chips will fail. Prices should also be shown; a device providing a subtle difference but costing 20 times more won't be a good competitor.
The Icybox DAC is less than $20. How can something be a competitor when it crushes blacks and has voltages all over the place?
Who is responsible for adding the pixel repetition and extra blanks and such.. is that done by the cores themselves or by the MiSTer framework that cores are built on top of?
It's part of the framework.
In that case the framework already knows about the original resolution.. so no core changes would have to be made in order to add a new "really direct video (for real and unmodified)" output option that just outputs the raw pixels? As long as the ADV7513 supports encoding these out-of-HDMI-spec resolutions?
Is it currently confirmed that the Morph 4K and the OSSC Pro will require this metadata to work, and that the developers of those two scalers will use it if provided? Sorry for the probing questions, but it's important to establish how strong the case is for adding something. I assure you that I'm asking in good faith here.
Dirt cheap, readily available, known to work, etc... There is no obsession, it's just what was easily available to all users to purchase online if they wanted vga output without an analog io board when this was designed.
Regarding the competing DACs from the Google spreadsheet that has been making the rounds... the only ones that are as readily available and in the same price range also already work fine with the current method without modification to the framework (i.e. ones that use the LT8621SX, CS5210, IT6902, IT6892FN, and IT6604E), so I don't see the point in complaining about the AG620x in relation to this feature request. There are plenty more you can buy that have better levels already, if you want. This is wholly irrelevant to the point of this feature request though.

I agree with sorgelig regarding the skepticism of the testing methodology; testing merely the SNES core is a bit strange. The reference levels on the second tab cited are from a real SNES using component and not composite, I assume? Shouldn't it be a test of composite instead, as that is how a real SNES operated on real CRTs contemporaneously? Or shouldn't the reference have been a system that output RGB natively if you are going to compare against RGB output? Or rather... is it a valid assumption that one SNES modded for YPbPr should match the RGB output of a MiSTer core? Please forgive my ignorance on this; I'm not trying to discredit it, but that part seemed odd to me personally.

Regarding the DAC testing, did they have hdmi_limited=0 or hdmi_limited=2 set in their MiSTer.ini? You can see that in the component voltage tests the reference range is 16-255, so I hope hdmi_limited=2 was used for these tests of the AG620x, since that option was added specifically to address this. If hdmi_limited=0 was set then this would explain the crushed blacks on the AG620x, and if that were the case then that is just a case of user error. The testing should be updated using the proper intended settings if that is the case.
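For readers unfamiliar with the hdmi_limited values mentioned here, a small sketch of the output ranges as I understand them from this thread and the MiSTer.ini options (treat this as a summary to verify against the current docs, not authoritative documentation):

```python
def hdmi_range(hdmi_limited: int) -> tuple:
    """Map a MiSTer.ini hdmi_limited value to its output level range:
    0 = full range (0..255), 1 = standard limited range (16..235),
    2 = 16..255, the mode discussed above that was added specifically
    for VGA DACs like the AG620x."""
    ranges = {0: (0, 255), 1: (16, 235), 2: (16, 255)}
    return ranges[hdmi_limited]
```

With hdmi_limited=0, black sits at code 0 while the DAC expects 16, which would show up exactly as the crushed blacks described above.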
I tested the SNES and TG16 cores, and aside from the TG16 width having some extra slack (e.g. 360px instead of 320), positioning is working OK with cores. menu.rbf should also be updated since it's now claiming DV1 with an invalid offset (de_h=de_v=0). I have to subtract 1 from de_v to fix the vertical offset on cores, so a value of 0 causes issues (for now I have a temporary check for it).
Actually the TG16 de_v seems to be off by another line compared to the other cores. Not sure what's going on.
Make sure you count lines the same way as I mentioned earlier. If you count lines differently you may end up off by 1 line depending on the HSync position.
For the menu core I strongly suggest turning off DirectVideo mode. The menu works much better in normal HDMI output.
The number of lines is counted from the trailing edge of vsync, but there is an implicit assumption that vsync edges are aligned with hsync leading edges for non-interlace. This might explain the offset, although I would not expect the 2-line deviation seen with the TG16 core. Anyway, it's probably not a huge deal for most users, so the issue can be closed from my point of view.
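To make the counting-convention point concrete: two devices that count active lines from different edges can disagree by exactly one line whenever vsync's trailing edge is not aligned with an hsync leading edge. A toy Python sketch; the function names and the de_v adjustment are illustrations of the behavior reported above, not MiSTer code:

```python
import math

def active_line_offset(de_v: int) -> int:
    """Apply the 'subtract 1 from de_v' fix reported above, guarding
    against the invalid de_v == 0 case advertised by menu.rbf."""
    return max(de_v - 1, 0)

def lines_after_vsync(vsync_trailing_edge: float, align_to_hsync: bool) -> int:
    """If the vsync trailing edge lands mid-line (e.g. at line 3.5),
    a receiver that waits for the next hsync leading edge counts one
    more line than one that truncates to the current line."""
    if align_to_hsync:
        return math.ceil(vsync_trailing_edge)
    return math.floor(vsync_trailing_edge)
```

When the edges are aligned (an integer line position) both conventions agree, which is why the mismatch only shows up on cores with unusual sync placement.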
I don't understand where you are testing and how. The grid is not always supposed to have the same border area around it. Some systems like TGFX have variable blanks. MiSTer just provides what the system originally gives.

Sorry for the late response. I was not aware of variable blanks, so maybe that is the answer. Again, all the grid tests are perfectly centered as far as I can tell, except for PCE 512x240 and MD 320x240. For example, in the PC Engine screenshot, the left side shows more of the white area than the right. It is as if the entire active area needs to be shifted to the left. I just assumed it should be centered like all the other resolution grid tests appear to be. I would test a 512x240 PC Engine game, but I am not sure which games support this. Do you know of an example? I did have someone on the MiSTer Discord test original hardware, and they claimed the 512x240 grid test was indeed centered. I don't have a picture from them though, and I do not have original hardware to test myself. The Mega Drive picture above has the same issue (the active area seems like it needs to be shifted to the left). It looks like it is only off center by maybe a few lines of resolution; you can see the left side is slightly thicker than the right. I notice it in Sonic the Hedgehog with the blue borders, so it definitely shows in 320x240 games when displaying the overscan area. Most people tend to crop out the overscan area, so they probably don't notice. Just figured this was worth reporting since we are still in the testing phase. :) Edit: This could be bugs with the 240p test suites as well.
I've noticed that, using RT4K + MiSTer DV1, the menu core exhibits some odd behavior switching between the static snow effect and using a wallpaper image. When using static, the MiSTer seems to be sending 1174x240, and when using a wallpaper, the MiSTer seems to be sending 731x240. Both get decimated by a factor of two, down to 587x240 and 365x240 respectively. This behavior is correct when using static, as it appears from my observation to use a pixel repetition factor of 2. However, the wallpaper mode does not repeat pixels, so this maintained decimation factor of two results in the menu and wallpaper both getting mangled. Correct me if I'm making some hasty assumption, but to me this sounds like the MiSTer is not correctly updating the DV1 data from a pixel repetition factor of 2 to a pixel repetition factor of 1 when you switch from static to wallpaper.
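The widths quoted above are consistent with a stale repetition factor. A quick sketch of the de-repetition arithmetic (the function name is just illustrative):

```python
def derepeat(received_width: int, pixel_rep: int) -> int:
    """A scaler recovers the source width by decimating the received
    width by the advertised pixel repetition factor."""
    return received_width // pixel_rep

# Static snow: pixels really are repeated 2x, so decimation is correct.
static_width = derepeat(1174, 2)    # the observed 587
# Wallpaper: no repetition, but a stale factor of 2 halves the image.
mangled_width = derepeat(731, 2)    # the observed, mangled 365
correct_width = derepeat(731, 1)    # what a factor of 1 would give
```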
I've also experienced the same issue with the wallpaper mode.
The menu core should be used in normal HDMI mode, not DV, as I've already mentioned.
@memmam @cobhc2019 Here's the documentation specifically on how to add that to your MiSTer.ini: https://mister-devel.github.io/MkDocs_MiSTer/advanced/ini/#adding-core-specific-settings
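Per the linked documentation, core-specific overrides live in named sections of MiSTer.ini, so direct video can stay on globally while the menu core uses normal HDMI output. A sketch of what that might look like (verify the exact section and option names against the docs above before copying):

```ini
[MiSTer]
direct_video=1   ; keep DV1 on for cores feeding the external scaler

[menu]
direct_video=0   ; menu core works much better in normal HDMI output
```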
I have a consistent and detailed step-by-step repro of a specific MiSTer_unstable + DV1 + RetroTink-4K issue. I've already discussed it with folks in the MiSTer Discord, and they suggested I report it here. Should I really dump my detailed issue report into this lengthy issue thread, or should I file a new separate issue so it can have its own detailed discussion thread? Please advise.
I have decided to write up and file the detailed repro in Jotego's GitHub, since his cores appear to trigger the issue. You can read about it here: jotego/jtcores#535 It is concerning that MiSTer remains irrecoverably "stuck" in a bad state (in which DV1 no longer functions correctly) after loading one of Jotego's cores. As a user, my expectation is that a warm reboot, or possibly a cold reboot, should always put MiSTer back into a good state after a misbehaving core has caused problems. Having to hard-cycle the power is an undesirable way to get back to a good working state.
One of the issues is that Main does not write SPD data when loading a non-DV core (the menu core with directvideo=0, for example), so it still uses the old SPD data of the previous core. A solution is to clear the SPD data in

The other issue is that some of Jotego's cores have trouble with HDMI and i2c. This error is in the log:
What would be required to enable cores to pass rotation information along in the
There are two kinds of rotation.
Screen flip (mirroring X and Y) is not always supported either, as some hardware requires software changes for it.
The core knows whether the game is horizontal (yoko) or vertical (tate) and whether rotation needs to be +90º or -90º. I do not know if the RetroTink can do something with that, or how to pass that information, though.
How does the core know? Is it hard-coded inside the core, in the MRA file, or both? If the information is in the MRA file, could MiSTer_Main parse it and use it to populate the DV1 data?
All JT cores have this information encoded this way. This is not a standard MiSTer feature, though.
Devs in Discord also pointed out the

It sounds like two different people identified a need for the same information to be in the MRA and created two different ways of putting it there. One of those ways is well-structured but unofficial, and the other is unstructured but official. Maybe the first step should be to define an official way that is also well-structured. Or maybe that could be a longer-term goal. For now, maybe MiSTer_Main could look in the MRA for both ways and use whichever it finds.
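As a thought experiment for the "look for both ways" idea, here is a sketch of how MiSTer_Main might probe an MRA for a structured rotation declaration. MRA files are XML, but the `<rotation>` element name here is a hypothetical placeholder, since the thread does not spell out either real encoding:

```python
import xml.etree.ElementTree as ET

def mra_rotation(mra_xml: str):
    """Return the rotation in degrees if the MRA declares one via a
    hypothetical structured <rotation> element, else None so DV1 can
    fall back to suggesting nothing (the best-effort behavior
    discussed below)."""
    root = ET.fromstring(mra_xml)
    node = root.find("rotation")
    if node is not None and node.text:
        text = node.text.strip()
        if text.lstrip("+-").isdigit():
            return int(text)
    return None
```

A second probe for the unstructured-but-official encoding could be layered on in the same function once that format is pinned down.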
The

A firmware solution based on

A concern was expressed in Discord about creating the need to update a bazillion MRAs. I don't imagine that should be necessary. I'm thinking this enhancement would be best-effort; meaning, DV1 would only suggest a default orientation to the scaler if the MRA provides one. The user could still manually set orientation in the scaler itself. Would using the MRA info for this enhancement be architecturally appropriate?
Just leave it to the user and scaler, especially since users may have different physical screen orientations. It's not a problem for the user to set it for a specific core and forget about it.
Okay. It would be nice, but as you said, it's not essential.
Hi all, for the next couple of questions I'm referring to this format: Lines 1561 to 1574 in 09d301f
First question: does a pixel repetition setting in the spdif stack up with the pixel repetition value in the aviif? Meaning that a 2x aviif and a 2x spdif give you a total decimation of 4x? At the moment it is implemented that way on the Tink4K. Second question: I was playing with the horizontal setup. Thank you!
I don't know to whom you are addressing your question. MiSTer uses I2S connected to the ADV7513, not SPDIF.
Sorry for the imprecise abbreviation: I meant the Source Product Description InfoFrame (SPDIF), not the audio protocol! For example, earlier in this thread it is said that de_h counts the pixel clocks after the sync pulse; however, I found that the Tink4K counts after DE signalling. Did I miss something while scanning this thread? And the second question is about the interpretation of the pixel repetition value sent through the SPD InfoFrame. Should it overwrite what is actually sent with the Auxiliary Video Information InfoFrame (AVIIF), or stack up? This seems to be unclear in this thread, or I didn't find it. What is the common understanding here? I just want to have a common understanding in the community of what is actually being developed within this thread. Edit: ok, there was one piece of information I was missing while reading. So,
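The two possible semantics being asked about can be stated in a few lines. Neither is confirmed by this thread as the intended behavior, so this is only a sketch of the question itself:

```python
def total_decimation(aviif_rep: int, spdif_rep: int, stacking: bool) -> int:
    """'Stacking': the SPD InfoFrame factor multiplies the AVI InfoFrame
    factor (reportedly the current Tink4K behavior). 'Override': the SPD
    value replaces the AVI value."""
    if stacking:
        return aviif_rep * spdif_rep
    return spdif_rep
```

Whichever semantics the thread settles on, both the MiSTer side and the scaler side need to agree on it, which is exactly the common understanding being asked for.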
Hi developers,
I'm reaching out to you to request the possibility of adding metadata information through direct video so that external hardware such as scalers can use that information. A use case is the RetroTink 4K, where it takes metadata along with the direct video on how to exactly crop and scale the game.
Thanks