Package details: pkg:deb/debian/golang-github-ulikunitz-xz@0.5.6-2?distro=trixie
purl pkg:deb/debian/golang-github-ulikunitz-xz@0.5.6-2?distro=trixie
Next non-vulnerable version 0.5.15-1
Latest non-vulnerable version 0.5.15-1
Risk 3.1
Vulnerabilities affecting this package (1)
Vulnerability Summary Fixed by
VCID-aag6-jhbk-qqd6
Aliases:
CVE-2025-58058
GHSA-jc7w-c686-c4v9
github.com/ulikunitz/xz leaks memory when decoding corrupted multiple LZMA archives

### Summary

It is possible to put data in front of an LZMA-encoded byte stream without the situation being detected while reading the header. This can lead to increased memory consumption, because the current implementation allocates the full decoding buffer directly after reading the header. According to the [specification](https://github.com/jljusten/LZMA-SDK/blob/master/DOC/lzma-specification.txt), the LZMA header includes neither a magic number nor a checksum that could detect such an issue. Note that the code recognizes the problem later while reading the stream, but by that time the memory allocation has already been done.

### Mitigations

The release v0.5.15 includes the following mitigations:

- The ReaderConfig DictCap field is now interpreted as a limit for the dictionary size.
- The default is 2 gigabytes minus 1 byte (2^31-1 bytes).
- Users can check with the [Reader.Header] method what the actual values in their LZMA files are and set a smaller limit using ReaderConfig.
- The dictionary size will not exceed the larger of the file size and the minimum dictionary size. This is another measure to prevent huge memory allocations for the dictionary.
- The code supports stream sizes only up to a pebibyte (1024^5 bytes).

Note that the original v0.5.14 version had a compiler error on 32-bit platforms, which has been fixed in v0.5.15.

### Methods affected

Only software that uses [lzma.NewReader](https://pkg.go.dev/github.com/ulikunitz/xz/lzma#NewReader) or [lzma.ReaderConfig.NewReader](https://pkg.go.dev/github.com/ulikunitz/xz/lzma#ReaderConfig.NewReader) is affected. There is no issue for software using the xz functionality. I thank @GregoryBuligin for his report, which is provided below.
### Summary

When unpacking a large number of LZMA archives, even in a single goroutine, if the first byte of the archive file is 0 (a zero byte added to the beginning), an error __writeMatch: distance out of range__ occurs. Memory consumption spikes sharply, and the GC clearly cannot handle this situation.

### Details

Judging by the error __writeMatch: distance out of range__, the problem occurs in the code around this function:
https://github.com/ulikunitz/xz/blob/c8314b8f21e9c5e25b52da07544cac14db277e89/lzma/decoderdict.go#L81

### PoC

Run a function similar to this one in one or several goroutines on a multitude of LZMA archives that have a 0 (a zero byte) added to the beginning.

```go
import (
	"bufio"
	"crypto/sha256"
	"encoding/hex"
	"io"
	"os"
	"path/filepath"

	"github.com/ulikunitz/xz/lzma"
)

const (
	ProjectLocalPath = "some/path"
	TmpDir           = "tmp"
	TmpLZMAPrefix    = "unpack-" // added for completeness; not defined in the original report
	DirPerm          = 0o755     // added for completeness; not defined in the original report
)

// UnpackLZMA decompresses lzmaFile to a content-addressed path under
// ProjectLocalPath, hashing the output while writing it to a temp file.
func UnpackLZMA(lzmaFile string) error {
	file, err := os.Open(lzmaFile)
	if err != nil {
		return err
	}
	defer file.Close()

	reader, err := lzma.NewReader(bufio.NewReader(file))
	if err != nil {
		return err
	}

	tmpFile, err := os.CreateTemp(TmpDir, TmpLZMAPrefix)
	if err != nil {
		return err
	}
	defer func() {
		tmpFile.Close()
		_ = os.Remove(tmpFile.Name())
	}()

	sha256Hasher := sha256.New()
	multiWriter := io.MultiWriter(tmpFile, sha256Hasher)
	if _, err = io.Copy(multiWriter, reader); err != nil {
		return err
	}

	unpackHash := hex.EncodeToString(sha256Hasher.Sum(nil))
	unpackDir := filepath.Join(ProjectLocalPath, unpackHash[:2])
	_ = os.MkdirAll(unpackDir, DirPerm)

	unpackPath := filepath.Join(unpackDir, unpackHash)
	return os.Rename(tmpFile.Name(), unpackPath)
}
```

### Impact

Servers with a small amount of RAM that download and unpack a large number of unverified LZMA archives are at risk of memory exhaustion.
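Because the legacy LZMA header carries neither a magic number nor a checksum, prepending a single zero byte still yields a header that parses without error while shifting every field by one position. The following stdlib-only sketch of the 13-byte header layout (an illustration, not the library's actual parser) shows how the decoded dictionary-size field can balloon as a result:

```go
package main

import (
	"encoding/binary"
	"fmt"
)

// parseHeader reads the 13-byte legacy LZMA header: one properties byte,
// a 4-byte little-endian dictionary size, and an 8-byte uncompressed size.
// Nothing in the format lets a parser reject a misaligned stream.
func parseHeader(h []byte) (props byte, dictSize uint32, size uint64) {
	props = h[0]
	dictSize = binary.LittleEndian.Uint32(h[1:5])
	size = binary.LittleEndian.Uint64(h[5:13])
	return
}

func main() {
	// A plausible header: props=0x5d (lc=3, lp=0, pb=2), 8 MiB dictionary.
	good := append([]byte{0x5d}, make([]byte, 12)...)
	binary.LittleEndian.PutUint32(good[1:5], 8<<20)

	// Prepend a zero byte: the parser cannot tell the difference, but it
	// now decodes shifted garbage, including a ~2 GiB dictionary size.
	bad := append([]byte{0x00}, good...)

	for _, h := range [][]byte{good, bad[:13]} {
		p, d, s := parseHeader(h)
		fmt.Printf("props=%#x dict=%d size=%d\n", p, d, s)
	}
}
```

This is the allocation the v0.5.15 mitigations target: with DictCap interpreted as a limit, a shifted header can no longer translate directly into a multi-gigabyte buffer.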
0.5.15-1
Affected by 0 other vulnerabilities.
Vulnerabilities fixed by this package (1)
Vulnerability Summary Aliases
VCID-esea-tj2b-h7ey github.com/ulikunitz/xz fixes readUvarint Denial of Service (DoS)

### Impact

xz is a compression and decompression library focusing on the xz format, written entirely in Go. The function readUvarint, used to read the xz container format, may not terminate its loop when provided malicious input.

### Patches

The problem has been fixed in release v0.5.8.

### Workarounds

Limit the size of the compressed file input to a size reasonable for your use case.

### References

The Go standard library recently had the same issue, for which [CVE-2020-16845](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-16845) was allocated.

### For more information

If you have any questions or comments about this advisory:

* Open an issue in [xz](https://github.com/ulikunitz/xz/issues).

CVE-2021-29482
GHSA-25xm-hr59-7c27

| Date | Actor | Action | Vulnerability | Source | VulnerableCode Version |
|---|---|---|---|---|---|
| 2026-04-16T12:02:18.468695+00:00 | Debian Importer | Fixing | VCID-esea-tj2b-h7ey | https://security-tracker.debian.org/tracker/data/json | 38.4.0 |
| 2026-04-13T08:13:26.922295+00:00 | Debian Importer | Fixing | VCID-esea-tj2b-h7ey | https://security-tracker.debian.org/tracker/data/json | 38.3.0 |
| 2026-04-03T07:25:45.677672+00:00 | Debian Importer | Affected by | VCID-aag6-jhbk-qqd6 | https://security-tracker.debian.org/tracker/data/json | 38.1.0 |
| 2026-04-03T07:25:45.651605+00:00 | Debian Importer | Fixing | VCID-esea-tj2b-h7ey | https://security-tracker.debian.org/tracker/data/json | 38.1.0 |