
Stack Trace providing too much information when e.g. org.springframework.dao.DuplicateKeyException is thrown #1985

Open
amundsor opened this issue Jan 29, 2025 · 7 comments

@amundsor

Sorry if this issue doesn't belong in this repo, but I don't know of a better place for it. 🦆

We are using Spring Data JDBC in our application, and when these kinds of exceptions are thrown, the default is to print the entire SQL statement that failed to the console. I do want most of the information, like the DB operation, table, columns, and so on, but the actual data could be sensitive and should not appear in logs by default.

Could it be an idea to at least hide the actual data behind the debug/trace log levels, or some other kind of solution?

@spring-projects-issues spring-projects-issues added the status: waiting-for-triage An issue we've not yet triaged label Jan 29, 2025
@mp911de
Member

mp911de commented Jan 29, 2025

Thanks for reaching out. Care to post an example of what you're talking about? Feel free to obfuscate sensitive parts.

@amundsor
Author

amundsor commented Jan 29, 2025

I am using JdbcAggregateTemplate to do an update or insert.
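
For reference, the failing call looks roughly like this (a minimal sketch with assumed names; our actual repository differs):

```java
import org.springframework.data.jdbc.core.JdbcAggregateTemplate;

// Minimal sketch: a repository that saves an aggregate via JdbcAggregateTemplate.
// UpsertRepository and save() are illustrative names, not our real code.
class UpsertRepository {

    private final JdbcAggregateTemplate template;

    UpsertRepository(JdbcAggregateTemplate template) {
        this.template = template;
    }

    <T> T save(T aggregate) {
        // update() throws DbActionExecutionException on failure, and that
        // exception's message includes the full toString() of the aggregate.
        return template.update(aggregate);
    }
}
```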

In the complete stack trace there are actually three levels of causes, each of which prints out all the data.
The top-level cause is:

org.springframework.data.relational.core.conversion.DbActionExecutionException: Failed to execute BatchWithValue{actions=[Insert{entity=Inntekt(type=*, beskrivelse=null, inntektDetaljer=OkonomiDetaljer(detaljer=[UtbetalingMedKomponent(utbetaling=Utbetaling(brutto=*, netto=*, belop=null, skattetrekk=*, andreTrekk=0.0, utbetalingsdato=*, periodeFom=*, periodeTom=*, mottaker=null, organisasjon=Organisasjon(navn=*, orgnummer=*), tittel=null), tittel=*, komponenter=[Komponent(type=Ordinær, belop=*, satsType=null, satsAntall=0.0, satsBelop=0.0), Komponent(type=*, belop=*, satsType=null, satsAntall=0.0, satsBelop=0.0)])])), propertyPath=inntekter, dependingOn=DbAction.UpdateRoot(entity=Okonomi(soknadId=*, inntekter=[Inntekt(type=*, beskrivelse=null, inntektDetaljer=OkonomiDetaljer(detaljer=[UtbetalingMedKomponent(utbetaling=Utbetaling(brutto=*, netto=*, belop=null, skattetrekk=*, andreTrekk=0.0, utbetalingsdato=*, periodeFom=*, periodeTom=*, mottaker=null, organisasjon=Organisasjon(navn=*, orgnummer=*), tittel=null), tittel=*, komponenter=[Komponent(type=*, belop=*, satsType=null, satsAntall=0.0, satsBelop=0.0), Komponent(type=*, belop=*, satsType=null, satsAntall=0.0, satsBelop=0.0)])]))], utgifter=[], formuer=[], bekreftelser=[Bekreftelse(type=*, tidspunkt=*, verdi=true)], bostotteSaker=[])), idValueSource=NONE, qualifiers={}}], batchValue=NONE}
	at org.springframework.data.jdbc.core.AggregateChangeExecutor.execute(AggregateChangeExecutor.java:118)
	at org.springframework.data.jdbc.core.AggregateChangeExecutor.lambda$executeSave$0(AggregateChangeExecutor.java:61)
	at org.springframework.data.relational.core.conversion.BatchedActions$InsertCombiner.lambda$forEach$2(BatchedActions.java:170)
	at java.util.HashMap.forEach(Unknown Source)
	at org.springframework.data.relational.core.conversion.BatchedActions$InsertCombiner.lambda$forEach$3(BatchedActions.java:170)
	at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(Unknown Source)
	at java.util.stream.SortedOps$SizedRefSortingSink.end(Unknown Source)
	at java.util.stream.AbstractPipeline.copyInto(Unknown Source)
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(Unknown Source)
	at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(Unknown Source)
	at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(Unknown Source)
	at java.util.stream.AbstractPipeline.evaluate(Unknown Source)
	at java.util.stream.ReferencePipeline.forEach(Unknown Source)
	at org.springframework.data.relational.core.conversion.BatchedActions$InsertCombiner.forEach(BatchedActions.java:169)
	at org.springframework.data.relational.core.conversion.BatchedActions.forEach(BatchedActions.java:79)
	at org.springframework.data.relational.core.conversion.SaveBatchingAggregateChange.forEachAction(SaveBatchingAggregateChange.java:77)
	at org.springframework.data.jdbc.core.AggregateChangeExecutor.executeSave(AggregateChangeExecutor.java:61)
	at org.springframework.data.jdbc.core.JdbcAggregateTemplate.performSave(JdbcAggregateTemplate.java:499)
	at org.springframework.data.jdbc.core.JdbcAggregateTemplate.update(JdbcAggregateTemplate.java:237)
	at no.nav.sosialhjelp.soknad.v2.config.repository.UpsertRepositoryImpl.save(UpsertRepository.kt:30)
	... 132 frames truncated

which is caused by org.springframework.dao.DuplicateKeyException, which also prints out all the data,
which is caused by java.sql.BatchUpdateException, which also prints out all the data,
which is caused by org.postgresql.util.PSQLException.

@mp911de
Member

mp911de commented Jan 29, 2025

DuplicateKeyException and the exceptions below it only really contain table and column names, not the actual values, as we're using prepared statements. Reporting the entire entity in the exception is unfortunate. We have several reports regarding DbActionExecutionException (#1086, #1956) indicating that throwing this exception wasn't the best idea we've had so far.

I think this ticket will be marked as a duplicate of one of the issues above, and we might stop throwing DbActionExecutionException altogether.

@amundsor
Author

Caused by: org.springframework.dao.DuplicateKeyException: PreparedStatementCallback; SQL [INSERT INTO "inntekt" ("beskrivelse", "detaljer", "okonomi", "type") VALUES (?, ?, ?, ?)]; Batch entry 0 INSERT INTO "inntekt" ("beskrivelse", "detaljer", "okonomi", "type") VALUES ((*), ('{"*'), ('*'::uuid), ('*')) was aborted: ERROR: duplicate key value violates unique constraint "pk_inntekt" Detail: Key (okonomi, type)=(*, *) already exists. Call getNextException to see other errors in the batch.

This might also contain sensitive data. Perhaps a table should not have a PK containing sensitive data, but in the context of our application, a primary key containing a personal identifier could be considered sensitive.

Should the plain SQL that failed be printed at all as the default behaviour?

@schauder
Contributor

I think the SQL is very important for debugging these problems and therefore should definitely get logged.

@amundsor
Author

Indeed it is. But should the actual data that failed be logged at debug level?
If the default were SQL with masked data, it would still help a lot.
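
To illustrate the idea: masking wouldn't need to be driver-aware; even a crude scrub of literal values before the statement is logged would keep the structure visible. A hypothetical sketch (not an existing Spring or pgjdbc feature):

```java
import java.util.regex.Pattern;

// Hypothetical sketch: scrub literal values from a SQL string before logging it.
public class SqlMasker {

    // Quoted string literals, including doubled-quote escapes ('it''s').
    private static final Pattern STRING_LITERAL = Pattern.compile("'(?:[^']|'')*'");
    // Bare numeric literals.
    private static final Pattern NUMBER_LITERAL = Pattern.compile("\\b\\d+(?:\\.\\d+)?\\b");

    public static String mask(String sql) {
        String masked = STRING_LITERAL.matcher(sql).replaceAll("'*'");
        return NUMBER_LITERAL.matcher(masked).replaceAll("*");
    }

    public static void main(String[] args) {
        String sql = "INSERT INTO inntekt (okonomi, type) VALUES ('12345678901', 'LONN')";
        System.out.println(mask(sql));
        // → INSERT INTO inntekt (okonomi, type) VALUES ('*', '*')
    }
}
```

The table and column names survive, so the statement is still useful for debugging; only the values are lost.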

@mp911de
Member

mp911de commented Jan 30, 2025

Batch entry messages come from the Postgres driver. I didn't know that org.postgresql.core.v3.BatchedQuery replaces ? placeholders with the actual values.

The assembly of PSQLException is out of our reach as it comes from the driver.
The only two things that come to my mind are:

  1. Setting the logServerErrorDetail=false property when connecting to Postgres. It seems that flag also affects batches, see Undocumented property logServerErrorDetail does not prevent server error leakage in BatchResultHandler pgjdbc/pgjdbc#2147. I suggest you familiarize yourself with the driver documentation at https://jdbc.postgresql.org/documentation/use/ for further details on the driver config.
  2. Alternatively, configuring your own SQLExceptionTranslator that consistently constructs a message that is safe to log in your application. By default, SQLExceptionSubclassTranslator is used (unless your application ships with a sql-error-codes.xml file).
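
For option 1, this is just a connection property; with a typical Spring Boot datasource it could look like the following (host and database names are placeholders):

```properties
# application.properties: tell the Postgres driver not to include server error
# detail (which can contain actual key values) in exception messages.
spring.datasource.url=jdbc:postgresql://localhost:5432/mydb?logServerErrorDetail=false
```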
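
For option 2, a rough sketch (the class name is made up; how you register it, e.g. via JdbcTemplate#setExceptionTranslator, depends on your setup):

```java
import java.sql.SQLException;

import org.springframework.dao.DataAccessException;
import org.springframework.jdbc.support.SQLExceptionSubclassTranslator;
import org.springframework.jdbc.support.SQLExceptionTranslator;

// Hypothetical sketch: classify exceptions like the default translator does,
// but withhold the raw SQL so it never ends up in the exception message.
class SqlHidingExceptionTranslator implements SQLExceptionTranslator {

    private final SQLExceptionTranslator delegate = new SQLExceptionSubclassTranslator();

    @Override
    public DataAccessException translate(String task, String sql, SQLException ex) {
        // Pass null instead of the SQL so it is not echoed into the message.
        return delegate.translate(task, null, ex);
    }
}
```

Note that the original driver exception stays attached as the cause, so its message (including any inlined batch values) can still appear in a full stack trace; option 1 addresses that side.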
