Versions impacted by the bug
5.8.0
5.9.1
What went wrong?
Reading larger datasets that have been compressed with the GZIP filter fails due to an integer overflow in the decode method:
@Override
public byte[] decode(byte[] dataIn) throws IOException {
  // 8 * dataIn.length overflows int when dataIn.length > Integer.MAX_VALUE / 8,
  // making len negative.
  int len = Math.min(8 * dataIn.length, MAX_ARRAY_LEN);
  try (ByteArrayInputStream in = new ByteArrayInputStream(dataIn);
      InflaterInputStream iis = new InflaterInputStream(in, new Inflater(), dataIn.length);
      // ByteArrayOutputStream rejects a negative initial buffer size
      ByteArrayOutputStream os = new ByteArrayOutputStream(len)) {
    IO.copyB(iis, os, IO.default_socket_buffersize);
    return os.toByteArray();
  }
}
Relevant stack trace
java.lang.IllegalArgumentException: Negative initial size: -1065224600
at java.base/java.io.ByteArrayOutputStream.<init>(ByteArrayOutputStream.java:78)
at ucar.nc2.filter.Deflate.decode(Deflate.java:75)
at ucar.nc2.internal.iosp.hdf5.H5tiledLayoutBB$DataChunk.getByteBuffer(H5tiledLayoutBB.java:235)
at ucar.nc2.iosp.LayoutBBTiled.hasNext(LayoutBBTiled.java:101)
at ucar.nc2.internal.iosp.hdf5.H5tiledLayoutBB.hasNext(H5tiledLayoutBB.java:143)
at ucar.nc2.iosp.IospHelper.readData(IospHelper.java:364)
at ucar.nc2.iosp.IospHelper.readDataFill(IospHelper.java:292)
at ucar.nc2.internal.iosp.hdf5.H5iospNew.readData(H5iospNew.java:230)
at ucar.nc2.internal.iosp.hdf5.H5iospNew.readData(H5iospNew.java:204)
at ucar.nc2.NetcdfFile.readData(NetcdfFile.java:2122)
at ucar.nc2.Variable.reallyRead(Variable.java:854)
at ucar.nc2.Variable._read(Variable.java:736)
at ucar.nc2.Variable.read(Variable.java:614)
at ucar.nc2.dataset.VariableDS.reallyRead(VariableDS.java:519)
at ucar.nc2.dataset.VariableDS._read(VariableDS.java:471)
at ucar.nc2.Variable.read(Variable.java:614)
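The overflow can be reproduced in isolation. The sketch below is illustrative only: it assumes MAX_ARRAY_LEN is Integer.MAX_VALUE - 8 (a common array-size cap, not confirmed from Deflate.java), and the input length is a hypothetical value chosen so that 8 * length wraps to the exact -1065224600 seen in the stack trace. Doing the multiplication in long arithmetic before clamping avoids the negative result:

```java
public class OverflowDemo {
  // Assumed value of MAX_ARRAY_LEN; the constant in Deflate.java may differ.
  static final int MAX_ARRAY_LEN = Integer.MAX_VALUE - 8;

  // Buggy: 8 * inputLength is computed in int and wraps negative
  // whenever inputLength > Integer.MAX_VALUE / 8 (~268 million bytes).
  static int buggyLen(int inputLength) {
    return Math.min(8 * inputLength, MAX_ARRAY_LEN);
  }

  // Possible fix: multiply in long, clamp, then narrow back to int.
  static int fixedLen(int inputLength) {
    return (int) Math.min(8L * inputLength, MAX_ARRAY_LEN);
  }

  public static void main(String[] args) {
    // Hypothetical compressed-chunk size (~385 MB) that reproduces the
    // stack-trace value: 8 * 403_717_837 = 3_229_742_696, which wraps
    // to -1065224600 in 32-bit arithmetic.
    int big = 403_717_837;
    System.out.println(buggyLen(big)); // -1065224600
    System.out.println(fixedLen(big)); // 2147483639 (clamped to MAX_ARRAY_LEN)
  }
}
```

With the long-arithmetic version, oversized inputs are clamped to MAX_ARRAY_LEN instead of producing a negative size, so the ByteArrayOutputStream constructor no longer throws.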
Relevant log messages
No response
If you have an example file that you can share, please attach it to this issue.
If so, may we include it in our test datasets to help ensure the bug does not return once fixed?
Note: the test datasets are publicly accessible without restriction.
No
Code of Conduct