Hacker News

A stateless URL-shortening service using a published compression algorithm (both the algorithm and the codebook used for compression) could mitigate almost all of these problems:

* Browsers can decompress the URL on the fly: just move the mouse pointer over the shortened URL and you'll see the full URL before clicking.

* The DB can't get lost. There is no DB.

* There is no single point of failure; everybody can run the service, since the full information is inside the shortened URL.

* If this gets integrated into the major browsers, there is no extra round of lookups, and there is no possibility of DoS if the service(s) are attacked or offline. At worst you can't create new shortened URLs. But remember: everybody can run the same service.

It will never be as compact as stateful services, since the URLs will be a bit bigger, but the gain in reliability is huge.
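As a rough sketch of the scheme described above (this is an assumed design, not an existing service; the host name is hypothetical, and zlib stands in for whatever published algorithm and codebook the services would agree on), the "short" URL simply carries the full URL, compressed and encoded with a URL-safe alphabet, so expanding it needs no database at all:

```python
import base64
import zlib

def shorten(url: str) -> str:
    """Compress the URL and pack it into a URL-safe token."""
    compressed = zlib.compress(url.encode("utf-8"), 9)
    token = base64.urlsafe_b64encode(compressed).rstrip(b"=").decode("ascii")
    return "http://example.invalid/" + token  # hypothetical host

def expand(short_url: str) -> str:
    """Recover the original URL from the token alone, no lookup needed."""
    token = short_url.rsplit("/", 1)[1]
    padding = "=" * (-len(token) % 4)         # restore stripped base64 padding
    compressed = base64.urlsafe_b64decode(token + padding)
    return zlib.decompress(compressed).decode("utf-8")

url = "https://news.ycombinator.com/item?id=1"
assert expand(shorten(url)) == url
```

Any party running the same algorithm and codebook can expand the same token, which is what removes the single point of failure.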

Edit, and p.s.: seriously, SMS is not good for real message exchange and will almost certainly die off soon. So this is not really an issue; nobody actually needs URL shortening.



I don't think a compression algorithm is going to work. Compression rarely works on such short texts, and most URLs are already very dense in information per character. Also, URL shorteners have to stay within the URL-safe character set, and the final product has to be a valid URL.

I tried gzipping a Google Maps directions URL and then outputting that in base64. Results:

   $ wc -c url*
       217 url
       186 url.gz
       252 url.gz.base64
So the compressed and then base64'ed version is actually longer. And of course it's more opaque. And I haven't even added on some http://some.domain/ at the beginning to make it URL-like.
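The base64 step alone explains most of that result: base64 expands its input by a factor of 4/3 (plus padding), and gzip adds roughly 18 bytes of header and trailer, so the compressor has to save well over 25% just to break even. The same measurement can be sketched in Python (the sample URL here is made up, not the actual Google Maps URL from the experiment):

```python
import base64
import gzip

# Hypothetical sample URL, standing in for a real Google Maps directions URL.
url = b"http://maps.google.com/maps?saddr=New+York,+NY&daddr=Boston,+MA"

compressed = gzip.compress(url)         # adds ~18 bytes of gzip framing
encoded = base64.b64encode(compressed)  # 4/3 expansion, plus '=' padding

print(len(url), len(compressed), len(encoded))
# For inputs this short, len(encoded) ends up larger than len(url): the
# gzip framing plus base64 expansion outweigh whatever deflate saves.
```

On longer, more redundant URLs the gap narrows, but the 4/3 base64 penalty applies no matter how long the input is.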

This doesn't even work in theory, let alone the practical impossibility of getting every http-fetching service to adhere to this scheme.


Hello. Gzip is not the way to go; try 'Smaz' and you will get better results. I'm also going to write a version of Smaz specifically tuned to work well with URLs.

I'm doing this work for Redis (another project of mine), but I hope somebody else can exploit it to build a stateless URL-shortener service.
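For readers unfamiliar with Smaz, the core idea (heavily simplified here, with a made-up codebook; real Smaz ships a tuned 254-entry table and batches escaped bytes instead of escaping one at a time) is a fixed, published dictionary of common substrings, each replaced by a single-byte code, so even very short inputs can shrink:

```python
# Toy codebook of common URL fragments; a real one would be tuned on a corpus.
CODEBOOK = ["http://", "https://", "www.", ".com/", ".com", ".org", "maps.",
            "google", "search?q=", "index.html", "/", "."]
ESCAPE = 255  # marker for a verbatim (ASCII) byte not in the codebook

def compress(text: str) -> bytes:
    out = bytearray()
    i = 0
    while i < len(text):
        for code, entry in enumerate(CODEBOOK):
            if text.startswith(entry, i):
                out.append(code)          # codebook hit: one byte
                i += len(entry)
                break
        else:
            out.append(ESCAPE)            # miss: escape marker...
            out.append(ord(text[i]))      # ...followed by the verbatim byte
            i += 1
    return bytes(out)

def decompress(data: bytes) -> str:
    out = []
    i = 0
    while i < len(data):
        if data[i] == ESCAPE:
            out.append(chr(data[i + 1]))
            i += 2
        else:
            out.append(CODEBOOK[data[i]])
            i += 1
    return "".join(out)

url = "http://maps.google.com/"
packed = compress(url)
assert decompress(packed) == url
assert len(packed) < len(url)   # 23 input bytes shrink to a 4-byte token
```

Because the codebook is fixed and published rather than derived from each input, there is no per-message dictionary overhead, which is exactly what kills gzip on short strings.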


I don't know much about this sort of compression, but there are some pretty long URLs out there. Reducing a four-line Google Maps URL to one line would be an amazing achievement, but it doesn't seem like it's quite enough for what people do with shortened URLs today.

But, I'd be happy to be proven wrong.



