The practical situation is the following:
An ASP.NET website displays a table, where users can add or delete rows.
Each row has a unique ID.
I am trying to implement a 'permalink' feature, so that users can bookmark a specific set of rows, or send it to colleagues, and so on.
The URL looks like http://example.com/myform.aspx?ROWS=15-50-59-2153-41234-8211
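For reference, the link is built by simply joining the IDs with "-", along these lines (variable names here are illustrative, not my actual code):

```vbnet
' Illustrative sketch only: build the permalink by joining the
' selected row IDs with "-".
Dim aiRowIds As Integer() = {15, 50, 59, 2153, 41234, 8211}
Dim asRowIds As String() = Array.ConvertAll(aiRowIds, Function(i) i.ToString())
Dim sUrl As String = "http://example.com/myform.aspx?ROWS=" & String.Join("-", asRowIds)
```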
Up to this point, everything works fine. However, in some cases there are just too many rows, and the URL becomes longer than most browsers accept.
Therefore, I have tried to compress the resulting string, as follows:
'sIn is a string of row IDs, separated by vbLf:
'"5046" & vbLf & "231" & vbLf & "7836"
Dim sOut As String
Using oOut As New System.IO.MemoryStream()
    Using oGZip As New System.IO.Compression.DeflateStream(oOut, System.IO.Compression.CompressionMode.Compress)
        Using oStreamIn As New System.IO.StreamWriter(oGZip, System.Text.Encoding.UTF8)
            oStreamIn.Write(sIn)
        End Using
    End Using
    'The DeflateStream must be closed before reading oOut, so that
    'its buffered data is flushed to the MemoryStream.
    sOut = System.Convert.ToBase64String(oOut.ToArray())
End Using
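For completeness, the receiving page decodes the parameter with the corresponding inverse, roughly like this (sIn here is the Base64 string taken from the query string):

```vbnet
' Hypothetical counterpart: decode the ROWS parameter back into the ID list.
Dim sRows As String
Dim abCompressed As Byte() = System.Convert.FromBase64String(sIn)
Using oIn As New System.IO.MemoryStream(abCompressed)
    Using oGZip As New System.IO.Compression.DeflateStream(oIn, System.IO.Compression.CompressionMode.Decompress)
        Using oReader As New System.IO.StreamReader(oGZip, System.Text.Encoding.UTF8)
            sRows = oReader.ReadToEnd()
        End Using
    End Using
End Using
```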
This works quite well for very large lists of rows (perhaps 0.1% of the links generated), but for most simple tables the 'compressed' link is longer than the original.
For example, if I only have one row selected, the input string is 5980 and the output is
7b0HYBxJliUmL23Ke39K9UrX4HShCIBgEyTYkEAQ7MGIzeaS7B1pRyMpqyqBymVWZV1mFkDM7Z28995777333nvvvfe6O51OJ/ff/z9cZmQBbPbOStrJniGAqsgfP358Pj8i/se/9x+8//Bg5/8B
I think it is kind of defeating the purpose...
I could set a length threshold and skip compression when the string is below that size, but it's not very elegant...
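To make that fallback concrete, it would look something like this (CompressIds stands for the Deflate + Base64 routine above and is an assumed name; note also that Base64 output contains '+', '/' and '=', which would need URL-encoding either way):

```vbnet
' Sketch of the threshold fallback: emit the compressed form only when it is
' actually shorter than the plain ID list. CompressIds is assumed to wrap the
' Deflate + Base64 code shown earlier; a leading marker character tells the
' receiving page which of the two formats it got.
Dim sPlain As String = "15-50-59-2153-41234-8211"
Dim sCompressed As String = "z" & CompressIds(sPlain) ' "z" = compressed marker
Dim sParam As String = If(sCompressed.Length < sPlain.Length, sCompressed, sPlain)
```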
Are there more efficient ways to compress strings like this?