med-mastodon.com is one of the many independent Mastodon servers you can use to participate in the fediverse.
Medical community on Mastodon

#compression


This is Internet Gold.

> "... it is important to note that the compression algorithm used by lzip only discards the unimportant data. And if it was unimportant before, what makes it so important now? Huh? In fact, many users may find that compressing their entire file system and then restoring it will be a good way to learn what is truly important."

web.archive.org/web/2001060804

web.archive.org · Lzip lossy compression

Brand new PEP by @emmatyping to add Zstandard to the standard library:
peps.python.org/pep-0784/

Will it make it into 3.14 before the feature freeze on 2025-05-06? It'll be close, but it's possible!

The PEP also suggests namespacing the other compression libraries lzma, bz2 and zlib, with a 10-year deprecation for the old names.

Join the discussion to give your support, suggestions or feedback:

discuss.python.org/t/pep-784-a

Python Enhancement Proposals (PEPs) · PEP 784 – Adding Zstandard to the standard library: Zstandard is a widely adopted, mature, and highly efficient compression standard. This PEP proposes adding a new module to the Python standard library containing a Python wrapper around Meta's zstd library, the default implementation. Additionally, to a...
#PEP#PEP784#zstd
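The stdlib codecs the PEP proposes to namespace already share a one-shot `compress()`/`decompress()` interface, and the proposed `compression.zstd` module is described as following the same pattern. A minimal sketch using only today's module names (not the new `compression.*` namespace, which doesn't exist yet):

```python
# Round-trip through each of the stdlib codecs PEP 784 proposes to
# move under the "compression" namespace. All three expose the same
# one-shot compress()/decompress() API.
import bz2
import lzma
import zlib

data = b"compression " * 1000

for mod in (zlib, bz2, lzma):
    packed = mod.compress(data)
    assert mod.decompress(packed) == data
    print(f"{mod.__name__}: {len(data)} -> {len(packed)} bytes")
```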

New blog post where I explain how to minify your Godot export size, turning it from the default 93MB to a tiny 6.4MB file.
Some sacrifices were made, but this should serve as a guide on which options you can tune and how effective each one is!

popcar.bearblog.dev/how-to-min

Popcar's Blog · How to Minify Godot's Build Size (93MB → 6.4MB exe): A quick guide on how to dramatically reduce Godot's export sizes for your game/app!

BOOK LAUNCH — How to make books more sustainable?

→ How to make books more sustainable? Inspired by the image compression on its solar-powered website, Low-tech Magazine squeezed the article catalog of their three-volume book series into just one book. Compressing the content — an editorial and design choice — produces a larger reduction in resource use than printing on recycled paper could ever do.

→ During this book launch event, we present our "Compressed Book Edition", followed by a public discussion on sustainability in the book publishing business. We also have a number of copies for sale during the event.
18:45 Doors open
19:00 Book launch
19:30 Q&A
19:45 Book sale / snacks & drinks / networking / low-tech showcase ;)

📌 Barcelona — Akasha HUB, Carrer de la Verneda 19 (El Clot)

📝 sign-up on meetup! (meetup.com/akashabarcelona/eve)
🗯 poster by our intern Hugo Lopez

Have you ever thought about how amazing modern #video #compression is?

When #QuickTime came out in 1991, the video codecs didn't save you much space, and were very primitive. You could see the (temporally) repeating blocks very obviously, and they didn't have a lot of finesse.

Cinepak was the first one that had some real compression muscle, then Intel Indeo and others.

I've got a 113-second, 720x1600 video that SHOULD take up 21GB of data raw, but easily compresses down to 16MB, and I can hardly tell the difference.

That's 1360:1 savings!
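A back-of-the-envelope check of those numbers (the post doesn't state frame rate or pixel format, so 60 fps and 24-bit RGB are assumptions here):

```python
# Raw video size = width x height x bytes/pixel x fps x duration.
# 60 fps and 24-bit RGB are assumed; the post doesn't say.
width, height = 720, 1600
bytes_per_pixel = 3            # 24-bit RGB
fps, seconds = 60, 113

raw_bytes = width * height * bytes_per_pixel * fps * seconds
compressed_bytes = 16 * 10**6  # the ~16MB encoded file

print(f"raw: {raw_bytes / 1e9:.1f} GB")                 # 23.4 GB
print(f"ratio: {raw_bytes / compressed_bytes:.0f}:1")   # ~1464:1
```

At 30 fps the raw figure halves to about 11.7GB, so the quoted 21GB sits between the two; either way the ratio lands in the low thousands to one.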

[Archive — 2018] Une pinte de compression

It's about so-called "lossy" compression, where you sacrifice a bit of data fidelity to save space! 🗜️ Heresy to MP3 detractors (who, in general, have never done a blind test against FLAC 👀)

▶️ Read the comic: grisebouille.net/une-pinte-de-
📗 The best-of book: editions.ptilouk.net/gb10ans
❤️ Support: ptilouk.net/#soutien

Does anyone know the compression algorithm that was used in Connectix RAM Doubler, which was available for classic Mac OS in the 1990s? Or, even better: was the software ever reverse engineered or open sourced?

Since Mavericks (10.9), modern Mach-based OS X/macOS uses the WKdm algorithm for memory compression, so it would be interesting to see if this was already used back then.

en.wikipedia.org/wiki/Virtual_

en.wikipedia.org · Virtual memory compression - Wikipedia
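WKdm belongs to the WK family of word-oriented in-memory compressors: each 32-bit word is classified as all-zero, an exact match against a small dictionary of recently seen words, a partial match (top 22 bits), or a miss. A toy illustration of that classification step, not the actual WKdm code; the dictionary indexing below is made up for the example:

```python
# Toy illustration of the WK (Wilson-Kaplan) memory-compression idea:
# classify each 32-bit word as all-zero, an exact dictionary hit,
# a partial hit (top 22 bits match), or a miss. Real WKdm then packs
# these classes into tag, dictionary-index, and low-bits streams.

def wk_classify(words, dict_size=16):
    dictionary = [1] * dict_size          # small direct-mapped dictionary
    tags = []
    for w in words:
        if w == 0:
            tags.append("zero")           # zeros are extremely common in RAM
            continue
        slot = (w >> 10) % dict_size      # index by high bits (toy hash)
        entry = dictionary[slot]
        if w == entry:
            tags.append("exact")          # emit only the dictionary index
        elif (w >> 10) == (entry >> 10):
            tags.append("partial")        # emit index + the 10 low bits
        else:
            tags.append("miss")           # emit the full 32-bit word
        dictionary[slot] = w
    return tags

# Runs of zeros, repeats, and near-repeats compress well here:
sample = [0, 0, 0xDEADBEEF, 0xDEADBEEF, 0xDEADBF00, 0x12345678]
print(wk_classify(sample))
# → ['zero', 'zero', 'miss', 'exact', 'partial', 'miss']
```

The scheme works because in-memory data (pointers, counters, padding) is full of zeros and numerically similar words, which is exactly what the zero/exact/partial classes capture.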

In 2022, I had an idea that could decrease the size of all newly-published npm packages by about 5%, and it was completely backwards compatible. This would have improved performance and reduced storage costs. It seemed like such a good idea at first... evanhahn.com/my-failed-attempt

evanhahn.com · My failed attempt to shrink all npm packages by 5%: It seemed like a great idea at first.

"LLM's are storages of millions of vector programs, that are formed by passive exposure to human output."

No. There is so much wrong in this!

- Not programs, because they don't execute anything
- If you give an #LLM a mathematically recursive task, inference takes the same time regardless of complexity
- It's more like #compression than programs, really
- And "passive exposure"?! Wtf are these people smoking?!

It says more about the psychology of the people building #ai 😣