And ten years down the road, when 20 bits is easy, you're going to want to shrink it again, or at each interim update cycle. That is going to upset downstream parsers, such as web indexers, that expect to match a fixed length / pattern, or that have to write zero [de]fill padding logic.
ex: [a-z2-7]{16}.onion -- we now see subdomain and uppercase patterns posted, and resolving, beyond this original pre-prop224 spec, and that is hardly widely documented or known. ie: most untrained users think 0-9 is valid, perhaps even some full UTF-8 charset.
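To illustrate the breakage (a minimal sketch; the labels below are constructed, not real services, and the "loose" pattern is my assumption, not a published spec):

    import re

    # Hypothetical indexer rule hard-coded to the original 16-char v2 spec.
    V2_STRICT = re.compile(r"^[a-z2-7]{16}\.onion$")

    # Illustrative labels only, not real services.
    candidates = [
        "a" * 16 + ".onion",           # plain v2-style: matches
        "A" * 16 + ".onion",           # uppercase resolves, rejected here
        "www." + "a" * 16 + ".onion",  # subdomain resolves, rejected here
        "x" * 56 + ".onion",           # v3-length (56 chars): rejected
    ]

    for c in candidates:
        print(c, "->", bool(V2_STRICT.match(c)))

    # A less brittle matcher (still an assumption): case-insensitive
    # base32 label of v2 or v3 length, with optional subdomains.
    LOOSE = re.compile(
        r"^([0-9a-z-]+\.)*([a-z2-7]{16}|[a-z2-7]{56})\.onion$",
        re.IGNORECASE,
    )

Every length change means every such downstream regex gets revised again.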
Rather than trying to shorten crypto hashes or key encodings for their own sake, or for silly human reasons (the least important of which is memorization, which belongs in another layer anyway), consider the actual constraints: buffers in apps, shell input limits, 72-char file widths, etc.
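For concreteness, the length math for a 32-byte key under common encodings, against an assumed 72-column width:

    import base64

    KEY_BYTES = 32   # e.g. an ed25519 public key
    WIDTH = 72       # assumed plain-text line width

    key = bytes(KEY_BYTES)   # dummy all-zero key; only the length matters

    encoded = {
        "hex":    key.hex(),
        "base32": base64.b32encode(key).decode().rstrip("="),
        "base64": base64.b64encode(key).decode().rstrip("="),
    }

    for name, s in encoded.items():
        verdict = "fits" if len(s) <= WIDTH else "wraps"
        print(f"{name:7}{len(s):3} chars -> {verdict} in {WIDTH} cols")

    # hex 64, base32 52, base64 43 -- all fit today, but a v3 address
    # is already 56 chars (pubkey 32 + checksum 2 + version 1 = 35
    # bytes = 280 bits, exactly 56 base32 chars) before ".onion".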
Prefixing for some anti-DoS sounds nice, but it too goes the way of Moore, and you can only soft-increase it without hard-banning users running old code and old onion keys.
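A sketch of why the increase is "soft" at best, assuming prefixing means a hashcash-style leading-zero-bits target on the key (the thresholds and cutover flag are hypothetical; nothing in an onion key proves its age, which is the trap):

    import hashlib

    OLD_BITS = 20   # hypothetical original anti-DoS target
    NEW_BITS = 24   # hypothetical raised target once Moore catches up

    def leading_zero_bits(data: bytes) -> int:
        """Number of leading zero bits in a byte string."""
        n = 0
        for b in data:
            if b:
                return n + 8 - b.bit_length()
            n += 8
        return n

    def accept(pubkey: bytes, predates_cutover: bool) -> bool:
        # Grandfathering old keys keeps them working (a soft increase),
        # but since key age can't be proven, an attacker simply claims
        # it; demanding NEW_BITS from everyone (a hard increase) bans
        # every legitimate user still on old code and old onion keys.
        required = OLD_BITS if predates_cutover else NEW_BITS
        work = leading_zero_bits(hashlib.sha256(pubkey).digest())
        return work >= required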