Traverse duplicates - POC for automated duplicate checking #2464

Merged
19 commits merged into eclipse-che4z:development on Sep 11, 2024

Conversation

chacebot (Contributor) commented Sep 5, 2024

How Has This Been Tested?

Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration.

  • TestCICSReceive.java
  • TestCICSAllocate.java

Checklist:

  • Each of my commits contains one meaningful change
  • I have performed rebase of my branch on top of the development
  • I have performed a self-review of my own code
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • I have checked my code and corrected any misspellings

chacebot (Contributor, Author) commented Sep 5, 2024

@slavek-kucera Happy to hear your thoughts on this. Checking tokens based on the arbitrary value of < 900 is hacky, but I was looking for an avenue away from all the manual option checking.

for (ParseTree entry : ctx.children) {
  if (entry.getChildCount() == 0) {
    TerminalNode current = (TerminalNode) entry;
    if (current.getSymbol().getType() < 897) {
      // option token is collected/checked here
    }
  }
}
Contributor

I assume that 897 is a token id? This, in general, does not look like a good idea; the tokens could be reordered for a number of reasons, and this condition would then change its meaning.

I think you'd need to construct a table of tokens you want to check using CICSLexer.tokenname.
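
For illustration, such a table could be built once from the generated lexer's vocabulary instead of relying on numeric ranges. This is only a sketch: it assumes CICSLexer is the project's ANTLR-generated lexer and that the option names passed in match its symbolic token names.

import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import org.antlr.v4.runtime.Vocabulary;

class DuplicateOptionTokens {
  // Resolve token ids from symbolic names once, so reordering the grammar's
  // tokens cannot silently change what the duplicate check matches.
  static Set<Integer> buildTokenIdTable(String... optionNames) {
    Vocabulary vocabulary = CICSLexer.VOCABULARY;
    Set<String> wanted = new HashSet<>(Arrays.asList(optionNames));
    Set<Integer> tokenIds = new HashSet<>();
    for (int type = 0; type <= vocabulary.getMaxTokenType(); type++) {
      String symbolicName = vocabulary.getSymbolicName(type);
      if (symbolicName != null && wanted.contains(symbolicName)) {
        tokenIds.add(type);
      }
    }
    return tokenIds;
  }
}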

Contributor Author

I did not create an efficient lookup table, but I did implement a fix to iterate over the ruleName arrays to match the token text.
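
Roughly, that matching has the following shape; this is a sketch only (the helper added in the PR may differ in name and form), assuming CICSLexer.ruleNames is the ANTLR-generated array of lexer rule names.

import org.antlr.v4.runtime.tree.TerminalNode;

class OptionTextMatcher {
  // Linear scan over the generated lexer's rule names, matching a node's text.
  static boolean isCheckedOption(TerminalNode node) {
    for (String ruleName : CICSLexer.ruleNames) {
      if (ruleName.equalsIgnoreCase(node.getText())) {
        return true;
      }
    }
    return false;
  }
}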

harvestResponseHandlers(ctx.cics_handle_response(), contexts);
checkDuplicates(contexts);
checkResponseHandlers(ctx.cics_handle_response());
checkDuplicates(ctx, "ASIS", "BUFFER", "NOTRUNCATE", "LEAVEKB");
Contributor

Is there a case where one of these keywords triggers a warning in one rule, but an error in another?
If not, we could have a single static map.
If so, we could still have a handful of static maps.

Contributor Author

This is all hit or miss depending on the command. I do not think it is safe to assume that the warning severities are uniform across the compiler for different commands with the same duplicate options. With that, I do not see the benefit of creating multiple static maps that mirror what we are passing as string args here.

Contributor

The benefit would be exactly that: not creating the hashmap every time the function is called.

Contributor Author

Actually, good push for this. I have not yet found a special case where duplicate-warning keywords behave differently across the commands I've seen so far, so I consolidated them into a single static map.
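
A minimal sketch of what such a single static map could look like, assuming the project's ErrorSeverity enum (as seen in the diff) and treating the keywords listed as examples rather than the full set:

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

class DuplicateSeverityTable {
  // Built once and shared, so the table is not recreated on every call.
  static final Map<String, ErrorSeverity> DUPLICATE_SEVERITIES;

  static {
    Map<String, ErrorSeverity> severities = new HashMap<>();
    severities.put("NOTRUNCATE", ErrorSeverity.WARNING);
    // further special cases would be registered here
    DUPLICATE_SEVERITIES = Collections.unmodifiableMap(severities);
  }

  // Options not present in the map default to ERROR, so only special cases
  // need to be listed.
  static ErrorSeverity severityOf(String option) {
    return DUPLICATE_SEVERITIES.getOrDefault(option, ErrorSeverity.ERROR);
  }
}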

checkResponseHandlers(ctx.cics_handle_response());
Map<String, ErrorSeverity> specialSeverities = new HashMap<>();
specialSeverities.put("NOTRUNCATE", ErrorSeverity.WARNING);
checkDuplicates(ctx, "NOTRUNCATE");
Contributor

The biggest problem with this approach is that we lose the ability to check logical groups, e.g. "INTO or SET".

Maybe it could be addressed by extending the map I talked about in the context of checkDuplicateEntries.
It could map token_id to {group_id, severity} (by default, group_id is the token text and severity is error).

The Map<Integer, TerminalNode> entries would be replaced just by Set<String>, the String being the group id, e.g. "INTO or SET", or the token text.
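
A rough sketch of that structure, with hypothetical names and the project's ErrorSeverity enum: each checked token id maps to a group id plus a severity, and the duplicate check only has to remember which group ids it has already seen.

import java.util.HashMap;
import java.util.Map;
import java.util.Set;

class DuplicateGroupTable {
  // Value type for the map described above: group id (defaults to the token
  // text) plus severity (defaults to error).
  static final class Entry {
    final String groupId;
    final ErrorSeverity severity;
    Entry(String groupId, ErrorSeverity severity) {
      this.groupId = groupId;
      this.severity = severity;
    }
  }

  // token_id -> {group_id, severity}; e.g. both INTO and SET would map to the
  // group id "INTO or SET", so specifying one of each is reported as a clash.
  static final Map<Integer, Entry> CHECKED_TOKENS = new HashMap<>();

  // Replaces Map<Integer, TerminalNode> with a Set<String> of seen group ids.
  static boolean isDuplicate(int tokenType, Set<String> seenGroupIds) {
    Entry entry = CHECKED_TOKENS.get(tokenType);
    if (entry == null) {
      return false; // token is not subject to the duplicate check
    }
    return !seenGroupIds.add(entry.groupId); // second hit in a group = duplicate
  }
}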

Contributor Author

@slavek-kucera The logic for groups can still be handled manually with checkIllegalArguments. This might be too much of an effort to automate, since we do not have a set standard for creating subgroups or for how they are meant to represent the compiler's behavior.

Contributor Author

This could be an argument to ditch subgroups altogether.

@chacebot force-pushed the traverse-duplicates branch from 2f58930 to cd687fe on September 6, 2024 17:18
@chacebot marked this pull request as ready for review on September 9, 2024 12:07
slavek-kucera previously approved these changes on Sep 10, 2024
@slavek-kucera dismissed their stale review on September 10, 2024 15:49

more changes

@slavek-kucera merged commit b15376e into eclipse-che4z:development on Sep 11, 2024
17 checks passed