Task arguments not parsing correctly #4782
I'm having the same issue. As I can see in the logs, the double quotes are escaped by the Local Task Launcher (I have not tried other deployers yet).
Task launch (REST API):
Command launched:
Note that the double quotes are escaped. As a result, the job parameters are misinterpreted by Spring Batch. I found no proper way to pass arguments containing spaces (such as a file path) to the job.
When I opened this issue, I was using the Kubernetes Deployer. The above report from @aritzbastida looks like slightly different behavior than I was experiencing and may even be a different issue related to arg parsing. In any case, parsing of arguments could be improved in DeploymentPropertiesUtils and perhaps other layers as well.
Hi all, I have this issue and I think it is related, but I don't know how to solve it. I'm using spring-cloud-dataflow-server-2.9.4. I'm trying to obtain a JSON, and the issue appears when the stream is redeployed. When I configure my stream to be deployed, using a processor (transform, script, or http-request), I need to set an "expression" attribute to an expression that contains quotes and escaped double quotes. The expression works properly the first time I set it and the stream deploys, but if I undeploy the stream and try to deploy it again, Spring Cloud Data Flow throws a state machine exception because the backslashes used to escape the double quotes are removed.

I already followed the considerations in the Spaces and Quotes documentation, but I think it only applies to the stream definition and not to deployment time. The Spaces and Quotes documentation is at: https://docs.spring.io/spring-cloud-dataflow/docs/current-SNAPSHOT/reference/htmlsingle/#shell-white-space

Here is a sample of the type of expression required:
The stream could be as simple as:
The first time the expression in the transform processor is set, it looks as follows:
This deploys the stream correctly. Once the stream is undeployed and I try to deploy it again, the expression looks like:
The backslashes were removed, causing the state machine exception because of the unescaped double quotes. This behavior prevents automating the deployment of the streams. Thanks in advance.
@bjafet123 Please open a new issue for this report. Thank you!
I am having difficulty passing a JSON string as an argument to a dataflow task. The quotes and spaces within the JSON are problematic for the parser. I have narrowed the issue to the following method within the dataflow server: org.springframework.cloud.dataflow.rest.util.DeploymentPropertiesUtils.parseArgumentList(String s, String delimiter)
It appears that I can surround my string with quotes to ensure it stays intact; however, there is no clear way to escape the quotes within it (I don't see anything in the parser that allows for this).
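For illustration, here is a minimal sketch of the kind of call involved. The method name and parameters come from the report above; the `List<String>` return type, the space delimiter, and the sample JSON value are assumptions made for the example:

```java
import java.util.List;

import org.springframework.cloud.dataflow.rest.util.DeploymentPropertiesUtils;

public class JsonArgumentExample {

    public static void main(String[] args) {
        // Hypothetical task argument whose value is a JSON document containing spaces
        // and quotes. The value is wrapped in double quotes to keep it intact, but the
        // inner quotes cannot be escaped in any documented way.
        String argumentLine = "jobConfig=\"{\"name\": \"demo run\", \"count\": 3}\" otherArg=val1";

        // Assumed to return one entry per parsed argument (space delimiter assumed);
        // the quoted JSON value ends up split apart instead of staying a single argument.
        List<String> parsed = DeploymentPropertiesUtils.parseArgumentList(argumentLine, " ");
        parsed.forEach(System.out::println);
    }
}
```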
Interestingly, it will parse JSON with spaces and quotes, but only if it is in an odd position within the argument list.
This argument list parses correctly:
arg2="Argument 2" arg3=val3
This argument list does not (the second argument does not remain intact):
arg1=val1 arg2="Argument 2" arg3=val3
This argument list parses correctly:
arg0=val0 arg1=val1 arg2="Argument 2" arg3=val3
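A sketch of how the three lists above could be fed to the same method to reproduce the position-dependent behavior (again, the `List<String>` return type and space delimiter are assumptions; the snippet only prints what the parser produces, since that output is exactly what is in question):

```java
import java.util.List;

import org.springframework.cloud.dataflow.rest.util.DeploymentPropertiesUtils;

public class QuotedArgumentPositionExample {

    public static void main(String[] args) {
        String[] argumentLines = {
            "arg2=\"Argument 2\" arg3=val3",                    // reported to parse correctly
            "arg1=val1 arg2=\"Argument 2\" arg3=val3",          // reported to split the quoted value
            "arg0=val0 arg1=val1 arg2=\"Argument 2\" arg3=val3" // reported to parse correctly
        };
        for (String line : argumentLines) {
            // Assumed: one list entry per parsed argument.
            List<String> parsed = DeploymentPropertiesUtils.parseArgumentList(line, " ");
            System.out.println(line + " -> " + parsed);
        }
    }
}
```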
I am seeing multiple issues here: