Add decompression size limit to prevent decompression bomb DoS #3625
Conversation
Add maximum decompression size limit in DeflateCodec to prevent OutOfMemoryError when processing maliciously crafted Avro files with high compression ratios (decompression bombs). The limit defaults to 200MB and can be configured via the system property `org.apache.avro.limits.decompress.maxLength`.
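A minimal sketch of the configuration mechanism described above. The property name and the 200MB default come from the PR description; the class and method names here are illustrative, not the actual patch:

```java
// Illustrative sketch of a property-driven decompression limit.
// Property name and 200MB default are from the PR description;
// the class and method names are hypothetical.
class DecompressLimit {
  static final String MAX_DECOMPRESS_LENGTH_PROPERTY =
      "org.apache.avro.limits.decompress.maxLength";
  static final long DEFAULT_MAX_DECOMPRESS_LENGTH = 200L * 1024 * 1024; // 200MB

  static long getMaxDecompressLength() {
    String prop = System.getProperty(MAX_DECOMPRESS_LENGTH_PROPERTY);
    if (prop != null) {
      try {
        return Long.parseLong(prop);
      } catch (NumberFormatException e) {
        // fall back to the default on a malformed value
      }
    }
    return DEFAULT_MAX_DECOMPRESS_LENGTH;
  }
}
```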
This is also a quick fix. I see some other modules have the same pattern.
```java
@Override
public ByteBuffer decompress(ByteBuffer data) throws IOException {
  long maxLength = getMaxDecompressLength();
```
There is no need to call this method on every decompress().
You can read it once in a static {...} block and reuse it.
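The reviewer's suggestion might look like the following: parse the property once in a static initializer and reuse the cached value. A sketch only; names are illustrative:

```java
// Sketch of the reviewer's suggestion: read the system property once at
// class load time instead of on every decompress() call.
class CachedDecompressLimit {
  static final String MAX_DECOMPRESS_LENGTH_PROPERTY =
      "org.apache.avro.limits.decompress.maxLength";
  static final long DEFAULT_MAX_DECOMPRESS_LENGTH = 200L * 1024 * 1024;

  static final long MAX_DECOMPRESS_LENGTH;
  static {
    long limit = DEFAULT_MAX_DECOMPRESS_LENGTH;
    String prop = System.getProperty(MAX_DECOMPRESS_LENGTH_PROPERTY);
    if (prop != null) {
      try {
        limit = Long.parseLong(prop);
      } catch (NumberFormatException e) {
        // keep the default on a malformed value
      }
    }
    MAX_DECOMPRESS_LENGTH = limit;
  }
}
```

The trade-off is that the limit is then fixed for the lifetime of the JVM: changing the property after the class has loaded has no effect.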
```java
try {
  return Long.parseLong(prop);
} catch (NumberFormatException e) {
  // Use default
```
This probably should be logged as a WARNING.
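A sketch of that suggestion, using java.util.logging so the example is self-contained (the real codebase would use its existing logging facade; names are illustrative):

```java
import java.util.logging.Logger;

// Sketch: log a WARNING instead of silently swallowing the parse failure.
class LoggedDecompressLimit {
  private static final Logger LOG =
      Logger.getLogger(LoggedDecompressLimit.class.getName());
  static final long DEFAULT_MAX_DECOMPRESS_LENGTH = 200L * 1024 * 1024;

  static long parseLimit(String prop) {
    if (prop != null) {
      try {
        return Long.parseLong(prop);
      } catch (NumberFormatException e) {
        LOG.warning("Invalid decompression limit '" + prop
            + "'; using default " + DEFAULT_MAX_DECOMPRESS_LENGTH);
      }
    }
    return DEFAULT_MAX_DECOMPRESS_LENGTH;
  }
}
```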
```java
String prop = System.getProperty(MAX_DECOMPRESS_LENGTH_PROPERTY);
if (prop != null) {
  try {
    return Long.parseLong(prop);
```
This will also accept negative values and 0, which are not very sensible. These should probably be rejected here with an early error.
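One way to reject non-positive values up front, as suggested. A sketch only; the error-handling policy (throwing IllegalArgumentException rather than, say, falling back to the default) is an assumption:

```java
// Sketch: validate the configured limit so that 0 and negative values
// fail loudly instead of silently producing a nonsensical limit.
class ValidatedDecompressLimit {
  static final long DEFAULT_MAX_DECOMPRESS_LENGTH = 200L * 1024 * 1024;

  static long parseLimit(String prop) {
    if (prop == null) {
      return DEFAULT_MAX_DECOMPRESS_LENGTH;
    }
    final long value;
    try {
      value = Long.parseLong(prop);
    } catch (NumberFormatException e) {
      return DEFAULT_MAX_DECOMPRESS_LENGTH;
    }
    if (value <= 0) {
      throw new IllegalArgumentException(
          "Decompression limit must be positive, got: " + value);
    }
    return value;
  }
}
```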
Two review threads on lang/java/avro/src/main/java/org/apache/avro/file/DeflateCodec.java were marked outdated and resolved.
….java Thanks! Co-authored-by: Martin Grigorov <[email protected]>
….java Co-authored-by: Martin Grigorov <[email protected]>
Decompression Bomb in Avro Java Codec Layer causes OutOfMemoryError
Summary
All Avro Java compression codecs (Deflate, Zstandard, XZ, BZip2, Snappy) decompress data without any size limit, allowing an attacker to craft a small Avro file (~50KB) that expands to an extremely large size (~50MB+), causing an OutOfMemoryError and crashing the JVM.

Root Cause

The codec implementations decompress data into an unbounded ByteArrayOutputStream without checking the output size.

Vulnerable Code (DeflateCodec.java:83)
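The unbounded pattern copies inflated data into a ByteArrayOutputStream until the input is exhausted. A bounded variant of that loop, checking the running output size against a cap, could look like this. This is a self-contained JDK sketch (default zlib framing, hypothetical method names), not the actual Avro source:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

class BoundedInflate {
  // Inflate 'compressed', failing fast once the output exceeds maxLength.
  static byte[] inflateWithLimit(byte[] compressed, long maxLength) throws IOException {
    Inflater inflater = new Inflater();
    inflater.setInput(compressed);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    byte[] buf = new byte[8192];
    try {
      while (!inflater.finished()) {
        int n = inflater.inflate(buf);
        if (n == 0 && inflater.needsInput()) {
          break; // truncated input; stop rather than spin
        }
        out.write(buf, 0, n);
        if (out.size() > maxLength) {
          throw new IOException(
              "Decompressed size exceeds limit of " + maxLength + " bytes");
        }
      }
    } catch (DataFormatException e) {
      throw new IOException(e);
    } finally {
      inflater.end();
    }
    return out.toByteArray();
  }

  // Helper for demos/tests: deflate a buffer with the default zlib framing.
  static byte[] deflate(byte[] data) {
    Deflater deflater = new Deflater();
    deflater.setInput(data);
    deflater.finish();
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    byte[] buf = new byte[8192];
    while (!deflater.finished()) {
      out.write(buf, 0, deflater.deflate(buf));
    }
    deflater.end();
    return out.toByteArray();
  }
}
```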
PoC

Trigger file

A crafted poc.avro file (49KB) that decompresses to 50MB, using the schema:

```json
{"type":"record","name":"Payload","fields":[{"name":"data","type":"bytes"}]}
```

How to generate poc.avro
```shell
# Generate poc.avro (needs large heap to create)
javac -cp avro-1.13.0.jar CreatePoC.java
java -Xmx256m -cp .:avro-1.13.0.jar:jackson-core-2.x.jar:jackson-databind-2.x.jar CreatePoC
```

Trigger Method 1: Official avro-tools CLI
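The roughly 1000:1 ratio behind the PoC (49KB expanding to 50MB) can be reproduced with the JDK alone, no crafted file needed. This standalone demo (not part of the PoC files referenced above) compresses 50MB of zeros:

```java
import java.io.ByteArrayOutputStream;
import java.util.zip.Deflater;

// Standalone demo of deflate's ratio on repetitive input: 50MB of zeros
// compresses to roughly the size of the poc.avro payload described above.
class BombRatio {
  static int compressedSizeOfZeros(int size) {
    Deflater deflater = new Deflater(Deflater.BEST_COMPRESSION);
    deflater.setInput(new byte[size]);
    deflater.finish();
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    byte[] buf = new byte[8192];
    while (!deflater.finished()) {
      out.write(buf, 0, deflater.deflate(buf));
    }
    deflater.end();
    return out.size();
  }
}
```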
Output:
Other affected avro-tools commands:
cat, count, getmeta, getschema, concat, recodec

Trigger Method 2: Fuzzer (oss-fuzz / Jazzer)
Build and run:
Impact
DataFileReader, DataFileStream, avro-tools

Suggested Fix
Add a maximum decompression size limit in DeflateCodec.java.

Note: Other codecs (ZstandardCodec, XZCodec, BZip2Codec, SnappyCodec) have the same issue. We can discuss the fix for those in a follow-up.
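Since the issue notes the same unbounded-copy pattern in every codec, one possible shared fix, purely a design sketch and not what this PR implements, is an OutputStream wrapper that enforces the cap for any codec's copy loop:

```java
import java.io.FilterOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// An OutputStream wrapper that fails once more than 'maxBytes' have been
// written, so each codec's decompression loop needs no per-codec limit logic.
class LimitedOutputStream extends FilterOutputStream {
  private final long maxBytes;
  private long written;

  LimitedOutputStream(OutputStream out, long maxBytes) {
    super(out);
    this.maxBytes = maxBytes;
  }

  @Override
  public void write(int b) throws IOException {
    count(1);
    out.write(b);
  }

  @Override
  public void write(byte[] b, int off, int len) throws IOException {
    count(len);
    out.write(b, off, len); // bypasses FilterOutputStream's byte-at-a-time default
  }

  private void count(long n) throws IOException {
    written += n;
    if (written > maxBytes) {
      throw new IOException(
          "Decompressed size exceeds limit of " + maxBytes + " bytes");
    }
  }
}
```

Each codec would then wrap its ByteArrayOutputStream in this before copying, and the OutOfMemoryError becomes a catchable IOException.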