
update to Isabelle 2023 #670

Merged

lsf37 merged 15 commits from isabelle-2023 into master on Oct 6, 2023

Conversation

@lsf37 (Member) commented on Sep 27, 2023

Updates all proofs and ML code to Isabelle2023.

Changes are fairly minor this time. There are a few tweaks in ML where functions now take more parameters (location info and similar); these are mostly filled with NONE or default values unless something specific is available to pass in.
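
As a rough illustration of the pattern (hypothetical function and call site, not taken from the actual changes), an ML update of this kind typically looks like this:

theory ML_Update_Sketch
  imports Main
begin

ML \<open>
  (* Hypothetical sketch: a function that used to take only a name now also
     takes a source position in the new release.  Call sites are updated by
     passing a default such as Position.none unless a real position is
     available to pass in. *)
  fun report_entry (name: string) (pos: Position.T) =
    writeln (name ^ Position.here pos);

  (* an old-style call "report_entry name" becomes: *)
  val _ = report_entry "example" Position.none;
\<close>

end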

On the proof side, the only larger change is that the syntax precedence of the "mapsto" arrow has changed: what used to be written f t (x |-> y) is now written (f t) (x |-> y), which is arguably clearer and easier to read in most cases.
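
A minimal sketch of the precedence change (hypothetical constant lookup standing in for terms like kheap s; not taken from the proofs):

theory Mapsto_Sketch
  imports Main
begin

(* a stand-in for state-dependent maps such as "kheap s" *)
consts lookup :: "nat \<Rightarrow> (nat \<Rightarrow> nat option)"

(* Isabelle2022 parsed the map update without extra parentheses:
     term "lookup s(x \<mapsto> y)"
   Isabelle2023 requires the applied map to be parenthesised: *)
term "(lookup s)(x \<mapsto> y)"

end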

Haven't noticed any major performance impact, but we'll see more comparable numbers when the AWS tests come in.

lsf37 marked this pull request as draft on September 27, 2023 01:36
@lsf37 (Member, Author) commented on Sep 27, 2023

(Setting this to draft for now, because we should merge the other larger PRs first and then rebase this one)

lsf37 force-pushed the isabelle-2023 branch 3 times, most recently from decdaae to 2cbf067, on October 5, 2023 21:58
lsf37 self-assigned this on Oct 5, 2023
lsf37 marked this pull request as ready for review on October 5, 2023 21:58
@lsf37 (Member, Author) commented on Oct 5, 2023

This is now ready for review. The last commit needs to be dropped before merging (it just makes the tests pick Isabelle2023 instead of Isabelle2022 for the PR).

@corlewis (Member) left a comment:

Looks good to me!
My only comment is that a lot of the commits could be squashed together, since I don't think we get any benefit from the mapsto syntax updates being per directory.

@lsf37 (Member, Author) commented on Oct 6, 2023

> Looks good to me!
> My only comment is that a lot of the commits could be squashed together, since I don't think we get any benefit from the mapsto syntax updates being per directory.

That makes sense; the current set mostly reflects how the commits were constructed. I'll squash them into larger sets.

@michaelmcinerney (Contributor) left a comment:

Looks good, just a few minor nitpicks

lib/Word_Lib/Word_Lemmas.thy — three resolved review comments (outdated)
- (s\<lparr>kheap := kheap s(thread \<mapsto>
-        TCB (tcb'\<lparr>tcb_arch := arch_tcb_context_set c' (tcb_arch tcb')\<rparr>))\<rparr>)"
+ (s\<lparr>kheap := (kheap s)
+        (thread \<mapsto> TCB (tcb'\<lparr>tcb_arch := arch_tcb_context_set c' (tcb_arch tcb')\<rparr>))\<rparr>)"
Review comment (Contributor):

I think I'd prefer the previous indentation

Commit message:

The code draws in table.ML from the Isabelle source, which changed in the 2023 release. This commit adds further library functions from Isabelle's library.ML and extracts the parts of unsynchronized.ML that work with MLton.

Signed-off-by: Gerwin Klein <[email protected]>
lsf37 merged commit ad24d95 into master on Oct 6, 2023 (9 of 14 checks passed)
lsf37 deleted the isabelle-2023 branch on October 6, 2023 04:21