
[WIP] Refactor approach to code generation, improve test suite. #107

Open · wants to merge 1 commit into main
Conversation

techdragon
Contributor

!!!!! DO NOT MERGE THIS PULL REQUEST !!!!!

I'm opening this pull request to get feedback on the additional tests I've been adding in my branch, since it has grown to the point where the structure of the test cases across modules, sub-modules and classes has become relevant.

I've marked some tests as skipped due to limitations of the simple test method I used to get them all added in this initial work, and I've marked a few items as expected failures where they exercise things that currently return NotImplemented.

I have added all the test case files that seemed appropriate from the W3C test case repository, but so far I have only added the simple "can generate client code" tests for the WSDL documents. I have yet to start on the WSDL server tests or the XSD-to-Python tests.
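To show the shape these initial tests take, here is a minimal sketch of a parametrised "can generate client code" test with skip/xfail markers. The directory layout, the unsupported-document set and the `generate_client_code` helper are illustrative assumptions, not the actual code in the branch:

```python
# Sketch only: parametrise over the collected W3C WSDL test-case files and
# assert that client code generation produces *something* without crashing.
import glob

import pytest

WSDL_FILES = sorted(glob.glob('tests/assets/w3c/**/*.wsdl', recursive=True))

# Hypothetical set of documents the generator cannot handle yet.
KNOWN_UNSUPPORTED = set()


def generate_client_code(wsdl_bytes):
    """Placeholder for the real soapfish WSDL-to-Python entry point."""
    raise NotImplementedError


@pytest.mark.parametrize('wsdl_path', WSDL_FILES)
def test_can_generate_client_code(wsdl_path):
    if wsdl_path in KNOWN_UNSUPPORTED:
        pytest.xfail('code generation currently returns NotImplemented here')
    with open(wsdl_path, 'rb') as f:
        code = generate_client_code(f.read())
    assert code.strip()
```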

The breadth of these tests has me pondering a significant change to the code generation as well. At the moment I'm exploring the RedBaron library to enable a transition from the current templated generation approach to one where templates generate small segments of code, such as a class, function, or variable, and RedBaron handles bringing these together, ensuring we end up with a valid syntax tree and valid Python code produced from it.
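As a rough illustration of that direction, here is a minimal sketch. The template strings are made up, the `ComplexType`/`String` names merely stand in for whatever the real templates would emit, and only RedBaron's constructor and `dumps()` are relied on:

```python
# Sketch: render small per-definition fragments from templates, validate each
# fragment by parsing it, then re-parse the assembled module with RedBaron so
# the emitted source is guaranteed to be syntactically valid Python.
from redbaron import RedBaron

CLASS_TEMPLATE = (
    'class {name}(ComplexType):\n'
    '    {field} = String()\n'
)


def render_class(name, field):
    fragment = CLASS_TEMPLATE.format(name=name, field=field)
    RedBaron(fragment)  # raises a parsing error if the fragment is broken
    return fragment


def assemble_module(fragments):
    module = RedBaron('\n\n'.join(fragments))  # full-syntax tree of the module
    return module.dumps()                      # back to source, formatting kept


print(assemble_module([render_class('Person', 'name'),
                       render_class('Address', 'street')]))
```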

Thoughts?
Feedback?

P.S. - PLEASE DO NOT MERGE THIS PULL REQUEST

@ngnpope
Member

ngnpope commented May 10, 2017

It is a shame that these are only available from W3C's CVS - would have been nice to link the test suite as a git submodule instead...

One major observation I have is that this test suite is for WSDL 2.0 and strictly speaking soapfish currently only supports (a subset of) WSDL 1.1 - this is something that should probably be addressed, see INFORMATION.md for links to specifications.

I don't really have time to look into RedBaron at the moment myself, but your idea makes sense and we do need an alternative to the One Big Template™ approach. Feel free to experiment!

@ngnpope ngnpope changed the title Seeking Feedback - DO NOT MERGE ! [WIP] Refactor approach to code generation, improve test suite. May 10, 2017
@techdragon
Contributor Author

While this is the WSDL 2.0 spec, there's no reason soapfish can't grow to support that, so the tests are still helpful. I went looking for equivalent tests for WSDL 1.1, but the situation there was much less clear. There are several fractured sets of test tools and various options, but they primarily seem focused on validating the client/server message exchange via a logging proxy middleman, which makes them much harder to adapt.

I'm likely to add significantly more tests for XSD via the XML Schema datatypes test suite. I'm effectively scavenging for test cases across the W3C XML working groups, the WSDL and SOAP specifications, and the test suites of other WSDL projects like Apache CXF, Apache Axis and gSOAP.

I'm hopeful that RedBaron can give us the best of both worlds: the simplicity of templates for the initial generation of code, and the robustness of an abstract syntax tree approach that can ensure the correctness of our output without needing to run it back through eval().
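As a concrete (if simplified) example of the eval()-free checking I have in mind, assuming RedBaron's documented `find()`/`dumps()` behaviour, generated source can be validated purely by parsing it:

```python
# Sketch: check generated code structurally without ever executing it.
from redbaron import RedBaron

generated = (
    'class Person(ComplexType):\n'
    '    name = String()\n'
)

tree = RedBaron(generated)  # parsing alone rejects syntactically invalid output
assert tree.find('class', name='Person') is not None

# compile() gives a second, standard-library syntax check; the resulting code
# object is never executed, so no eval()/exec() of generated code is involved.
compile(tree.dumps(), '<generated>', 'exec')
```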

@ngnpope
Copy link
Member

ngnpope commented May 10, 2017

Yeah. The eval() stuff is pretty evil. So is the conversion back and forth between code and XML for validation and ordering purposes.
