The specific filename Anything-V3.0-pruned-fp16.ckpt represents the "optimized" version of the model designed for everyday users:

pruned: The checkpoint has been stripped of data needed only for training, such as optimizer states and EMA weights, roughly halving the file size without changing the images it generates.

fp16: This stands for "Half-Precision Floating Point." By using 16-bit instead of 32-bit (FP32) weights, developers further halved the file size and VRAM requirements, making it possible to run the model on consumer-grade graphics cards like the RTX 3060 or even 10-series GPUs.

.ckpt: This was the standard file format for AI weights at the time, though it was later largely replaced by the safer .safetensors format due to security concerns regarding potential malware in "pickled" checkpoint files.
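To make the distinction concrete, here is a minimal sketch, assuming the Hugging Face diffusers and torch libraries (the file paths and device are illustrative), of loading such a checkpoint in half precision and re-saving it in the safer .safetensors layout:

```python
import torch
from diffusers import StableDiffusionPipeline

# Load the legacy .ckpt file. from_single_file parses the pickled
# checkpoint, which is exactly why untrusted .ckpt files are risky:
# unpickling can execute arbitrary code.
pipe = StableDiffusionPipeline.from_single_file(
    "Anything-V3.0-pruned-fp16.ckpt",   # illustrative path
    torch_dtype=torch.float16,          # 16-bit weights: half the VRAM of FP32
)
pipe = pipe.to("cuda")

# Re-save the weights in the .safetensors format, which stores raw
# tensors only and cannot run code when loaded.
pipe.save_pretrained("anything-v3-fp16", safe_serialization=True)
```

The torch_dtype=torch.float16 argument is what delivers the memory saving described above: each weight occupies 2 bytes instead of 4, so both the on-disk file and the VRAM footprint are roughly halved.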
The VAE "Mystery"