Implement sendExpression via executeCommand #97

Open · 3 tasks done
CSchoel opened this issue Jul 15, 2021 · 6 comments
Labels: enhancement (New feature or request)
CSchoel (Contributor) commented Jul 15, 2021

  • Introduce a new command sendExpression that accepts a string in the same format as would be required by the OMShell
  • Make sure that all kinds of errors that can occur when a command is sent are handled properly (a sketch follows this list):
    • If the command returns a string or other data type, return that data to the client.
    • If getErrorString() does not return an empty string after the command has been sent, report the results back to the user.
    • If the omc-java-api throws any exception, report the error message back to the user.
  • Test that it works at least for the following inputs:
    • loadModel(Modelica, {"3.2.3"})
    • setCommandLineOptions("-d=newInst,nfAPI")
    • setCommandLineOptions("--unitChecking")
    • getSimulationOptions(Modelica.Electrical.Analog.Examples.Rectifier)
    • simulate(Modelica.Electrical.Analog.Examples.Rectifier)
    • cd("/home")
    • cd()
  • For bonus points, find a way to report to the client a list of all the commands in the OpenModelica Scripting API with their documentation (if available)
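
To make the intended error handling more concrete, here is a minimal sketch of what a handler for such a command could look like. It assumes a hypothetical OmcConnection wrapper around omc-java-api and a hypothetical reportToClient() helper; none of these names are part of the actual extension or library API.

```java
// Placeholder for a thin wrapper around omc-java-api's sendExpression().
interface OmcConnection {
    String sendExpression(String expression);
}

// Hypothetical sketch of the requested error handling; OmcConnection and
// reportToClient() are illustrative placeholders, not the real extension API.
final class SendExpressionHandler {

    private final OmcConnection omc;

    SendExpressionHandler(OmcConnection omc) {
        this.omc = omc;
    }

    /** Sends a raw OMShell-style expression and reports results and errors. */
    String handle(String expression) {
        try {
            // Forward the expression unchanged, just as the OMShell would receive it.
            String result = omc.sendExpression(expression);

            // getErrorString() collects warnings and errors produced by the command
            // (assuming the wrapper returns the plain error text).
            String errors = omc.sendExpression("getErrorString()");
            if (errors != null && !errors.isBlank()) {
                reportToClient(errors);
            }
            return result; // hand the raw result back to the client
        } catch (RuntimeException e) {
            // omc-java-api may throw unchecked exceptions (e.g. IllegalStateException).
            reportToClient("sendExpression failed: " + e.getMessage());
            return null;
        }
    }

    private void reportToClient(String message) {
        // Placeholder: the real extension would surface this to the client,
        // e.g. as an editor notification.
        System.err.println(message);
    }
}
```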
manuEbg (Contributor) commented Jul 19, 2021

  • If the omc-java-api throws any exception, report the error message back to the user.

omc-java-api's sendExpression does not throw any exceptions.

I created a few test cases for this functionality.

  • Of course sendExpression5 fails, because the simulation takes a different amount of time every time...

  • sendExpression9 fails because of a wrong username...

  • I'm not sure what the problem with sendExpression8 is (it also fails on my machine).

I'm quite sure that at least the sendExpression4 error is caused by Modelica version 3.2.3, which gets loaded in a prior test and was not used to produce the expected output.

PS: The sendExpression links somehow don't jump to the correct line when clicked... right-click -> "Copy link" works fine.

manuEbg (Contributor) commented Jul 20, 2021

The problem with sendExpression4 is not caused by a different version of the Modelica Standard Library, but by loading any Modelica Standard Library at all... I will adjust the expected output.
I replaced the expected output for sendExpression5 with a crude regex (see the sketch below)... it's ugly, but I think it works.
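
As an illustration only (the actual test code may look different), such a pattern-based check inside a JUnit test method might pin down only the parts of the simulate() reply that are stable across runs, using an assumed omc connection:

```java
// Hypothetical example: match the simulate() reply against a pattern instead
// of an exact string, since timing values differ between runs.
String result = omc.sendExpression(
        "simulate(Modelica.Electrical.Analog.Examples.Rectifier)");
// (?s) lets '.' match newlines; only the stable success message is pinned down.
assertTrue(result.matches("(?s).*The simulation finished successfully.*"));
```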

manuEbg (Contributor) commented Jul 20, 2021

Two questions remain:

  • Why does the reported error location differ from the expected one?
    expected: <[<interactive>:1:1-1:18:writable] Error: Class unknownAPIMethod not found in scope <global scope> (looking for a function or record).
    > 
    but was: <[<interactive>:1:1-1:0:writable] Error: Class unknownAPIMethod not found in scope <global scope> (looking for a function or record).>
  • Where does the Notification during sendExpression9 come from?
    Notification: Downloaded package index from URL https://libraries.openmodelica.org/index/v1/index.json.

manuEbg added the enhancement (New feature or request) label on Jul 20, 2021
CSchoel (Contributor, Author) commented Jul 21, 2021

omc-java-api's sendExpression does not throw any exceptions.

It does not declare any exceptions in its signature, but that does not mean that it won't throw runtime exceptions; those are two different things. If you look at the implementation of sendExpression(), you can clearly see that the author did anticipate runtime exceptions (otherwise the try block would be meaningless) and even throws an IllegalStateException explicitly.
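
To make the distinction concrete, here is a minimal, self-contained sketch (not the actual omc-java-api code): a method that declares no checked exceptions can still fail with an unchecked one at runtime, and the caller has to catch it if the error should be reported instead of crashing the extension.

```java
// Illustration only, not the real omc-java-api implementation.
class OmcStub {
    private boolean connected = false;

    // No `throws` clause is needed for unchecked exceptions.
    String sendExpression(String expr) {
        if (!connected) {
            throw new IllegalStateException("Not connected to omc");
        }
        return "ok";
    }
}

class Caller {
    public static void main(String[] args) {
        OmcStub omc = new OmcStub();
        try {
            System.out.println(omc.sendExpression("cd()"));
        } catch (RuntimeException e) {
            // This is the case the extension has to report back to the user.
            System.err.println("sendExpression failed: " + e.getMessage());
        }
    }
}
```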

CSchoel (Contributor, Author) commented Jul 21, 2021

It was a good idea to implement the test cases. This allows us to discuss test design:

  • I would say that the most important thing about any unit test is that it should be clear what is tested, and ideally the name of the test should already make this clear. There are a few approaches to this, but I actually like to make test names quite verbose. To understand what you are talking about in these comments, I have to look at the code to understand what sendExpression4 (which is called executeCommand4 in the code - I assume to achieve maximum confusion ;P) actually tries to accomplish. If you instead name this test method something like retrievedSimulationOptionsForValidModelMatchesExperimentAnnotation there is no confusion anymore. The name is quite lengthy, but I would argue that the increased clarity justifies the clunky name. If you do not agree, you can shorten the name to something like getSimulationOptionsValid, but then you should put the full explanation in a Javadoc comment.
    Edit: For full disclosure: This is not my own idea. I got it from StackOverflow and followed it for my own projects, because I found the arguments convincing. The main reason to put the explanation in the name is that you do not have to look at the code to know what failed, you can already see it from the test summary.
  • When you know what it is that you want to test, it also becomes easier to focus on the relevant parts of the result data. 100% exact matches are only a good idea for integer arithmetic and low-level String manipulation; everything else involving floats and/or complex strings might need a more flexible approach. For example, sendExpression5 should probably only test that the result contains the string "The simulation finished successfully", as this test only cares about successful simulation and not, for example, simulation performance (see the sketch after this list).
  • sendExpression6 and sendExpression7 reveal another general pitfall during testing: They rely on the order in which the tests are executed. In general, you can never be sure that the order will stay the same and users might even want to just run a single test case out of the suite. Therefore, you have to ensure that all your test cases are completely independent of each other. You can allow some exceptions to this rule as, for example, setting up a new OMC instance for each test would require a large overhead, but there is no harm in, for example, (re-)loading the MSL before the simulate test, or reducing the cd() test to just checking that a valid file system path is returned - independent of what that path might be. As a general rule: Only test the behavior that is apparent from the specifications of the functions that you call. If you only know that cd() has been called, you should not make any assumptions about which path will be returned.
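
The following JUnit sketch illustrates these points; it assumes JUnit 5, and the test names, the OmcConnection placeholder, and the fixture setup are illustrative only, not the project's actual test code:

```java
import static org.junit.jupiter.api.Assertions.assertTrue;

import java.nio.file.Files;
import java.nio.file.Path;

import org.junit.jupiter.api.Test;

// Placeholder for whatever fixture the real tests use to talk to the compiler.
interface OmcConnection {
    String sendExpression(String expression);
}

class SendExpressionTest {

    private OmcConnection omc; // assumed to be initialized in a @BeforeEach setup

    @Test
    void simulatingRectifierExampleReportsSuccessfulSimulation() {
        // (Re-)load the MSL here so the test does not depend on execution order.
        omc.sendExpression("loadModel(Modelica)");
        String result = omc.sendExpression(
                "simulate(Modelica.Electrical.Analog.Examples.Rectifier)");
        // Assert only on the stable part of the output, not on timing values.
        assertTrue(result.contains("The simulation finished successfully"));
    }

    @Test
    void cdWithoutArgumentReturnsSomeExistingDirectory() {
        String result = omc.sendExpression("cd()");
        // The specification only guarantees *a* path, so do not assume which one.
        String path = result.replace("\"", "").trim();
        assertTrue(Files.isDirectory(Path.of(path)));
    }
}
```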

CSchoel (Contributor, Author) commented Jul 21, 2021

Two questions remain:

expected: <[<interactive>:1:1-1:18:writable] Error: Class unknownAPIMethod not found in scope <global scope> (looking for a function or record).
    > 
but was: <[<interactive>:1:1-1:0:writable] Error: Class unknownAPIMethod not found in scope <global scope> (looking for a function or record).>

As you can see, the difference lies in the column number where the error is reported. You expect to see the error from line 1, column 1 to line 1, column 18, which is the end of the string unknownAPIMethod(). Instead, you get a strange invalid range from line 1, column 1 to line 1, column 0. You could investigate where this comes from, but I would argue that it is inconsequential for the test. This is not a real file we are talking about, but a code snippet that is sent directly to the OMC. What good does a correct line and column do for debugging in this case? I would therefore suggest that you simply make the test more flexible, only checking for the relevant part such as Class unknownAPIMethod not found.

The downside to this is that with assertTrue(x.contains("...")) you do not see the actual string content in case the test fails. This can be remedied in two ways:

  • Implement your own assertContains() method that throws an AssertionError with a more meaningful error message (a sketch follows this list).
  • Use the more advanced but also more complicated assertThat from Hamcrest as shown in this StackOverflow answer.
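
A minimal sketch of such a helper could look like this; it uses a plain AssertionError so it is independent of the test framework:

```java
// Sketch of a custom assertion with a readable failure message.
static void assertContains(String expectedSubstring, String actual) {
    if (actual == null || !actual.contains(expectedSubstring)) {
        throw new AssertionError("Expected string to contain <" + expectedSubstring
                + "> but was: <" + actual + ">");
    }
}
```

It would then be called like assertContains("Class unknownAPIMethod not found", result), and a failing test still shows the full reply in its message.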
  • Where does the Notification during sendExpression9 come from?
    Notification: Downloaded package index from URL https://libraries.openmodelica.org/index/v1/index.json.

The OMC tries to install unknown libraries from a central package index if possible. The "correct" way to do this seems to be to use installLibrary(), but I think loadModel() also has a fallback mechanism where it tries to install a library when it cannot find it on the Modelica path. The error that the package index could not be parsed is a known issue in the OpenModelica project that we will have to work around for now.
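
For illustration, the explicit route could look roughly like this; the snippet assumes the scripting call is named installLibrary() as mentioned above, reuses the assumed omc connection from the earlier sketches, and the exact call name and reply format depend on the OMC version:

```java
// Hypothetical snippet: install the library explicitly from the package index
// instead of relying on loadModel()'s fallback behavior.
String installed = omc.sendExpression("installLibrary(Modelica)");
String loaded = omc.sendExpression("loadModel(Modelica)");
System.out.println("installLibrary: " + installed + ", loadModel: " + loaded);
```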
