#probabilistic

Probabilistic weather forecasting with machine learning: “Weather forecasts are fundamentally uncertain, so predicting the range of probable weather scenarios is crucial for important decisions, from warning the public about hazardous weather to planning renewable energy use.

Traditionally, #WeatherForecasts have been based on numerical weather prediction (NWP), which relies on physics-based simulations of the atmosphere. Recent advances in #MachineLearning (#ML)-based weather prediction (MLWP) have produced ML-based models with less forecast error than single NWP simulations. However, these advances have focused primarily on single, deterministic forecasts that fail to represent uncertainty and estimate risk. Overall, MLWP has remained less accurate and reliable than state-of-the-art NWP ensemble forecasts.

Here we introduce GenCast, a #probabilistic weather model with greater skill and speed than the top operational medium-range weather forecast in the world, ENS, the ensemble forecast of the European Centre for Medium-Range Weather Forecasts.”

#science / #weather <nature.com/articles/s41586-024>
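A minimal sketch (my own illustration with synthetic numbers, not GenCast's method) of what an ensemble forecast buys you over a single deterministic run: the probability of a hazardous event is read off as the fraction of ensemble members that predict it.

```python
import numpy as np

# Hypothetical 50-member ensemble of 10 m wind-speed forecasts (m/s)
# for one location and lead time; the values here are synthetic.
rng = np.random.default_rng(42)
members = rng.normal(loc=18.0, scale=4.0, size=50)

# A deterministic forecast gives one number; an ensemble gives a risk estimate.
threshold = 20.0  # illustrative gale-warning threshold, not an official one
p_exceed = float((members > threshold).mean())  # fraction of members above it

print(f"ensemble mean: {members.mean():.1f} m/s")
print(f"P(wind > {threshold:g} m/s) = {p_exceed:.2f}")
```

This fraction-of-members estimate is the basic reason probabilistic (ensemble) systems can support risk-based decisions that a single deterministic run cannot.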

@ScienceCommunicator

Right – so two basic ways to look at it:

Each specialty branch comes up with their own perspectives, priorities/hierarchies, and jargon. With few exceptions, they stay in their own lanes and do not mix outside their field/subfield. Their writings are stand-alone, rarely intended to connect back to the greater human knowledge base. They'll tell you exactly what "wet" or "sound" is, based entirely on their tiny-desk world-view.

The other way would be a general model of vibration, perhaps based on quantitative info rather than qualitative. It would incorporate all modes of vibratory phenomena, regardless of mediums, velocities, or other characteristics that were fleshed out by some subfield as critical for their particular context & definitions.

Frequency (Hz) actually would be a good quantitative baseline for this, if it hadn't been exorcised from #quantum mechanics.

The "old QM", as espoused by the founders and the historic supporters of the split in physics, used a semi-classical approach that was later abandoned. Later, a factor of 2π was folded into E = hf (giving E = ħω) as a convenience for some calculations, but frequency generally falls away when you work with quantum states à la Schrödinger. #Time is just an input to QM, and it is the #Newtonian, absolute kind. The concepts of frequency & wavelength seem to get in the way of the probabilistic formalism.
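For reference, the 2π rewriting is just a change of variables from frequency to angular frequency, not new physics:

```latex
E = h f = \frac{h}{2\pi}\,(2\pi f) = \hbar\,\omega,
\qquad \hbar \equiv \frac{h}{2\pi}, \quad \omega \equiv 2\pi f .
```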

IMO, the #probabilistic formalism then gets in the way of developing a #relativistic completion of quantum theory, though in reality there is no limit to the imaginative supplemental mathematical epicycles that can be added while still spitting out the same expected answers.


Which brings us to political economy – of course.

Because this turn to defense contracting that the large tech companies have been making is a key part of their revenue model.

Work by Tech Inquiry recently revealed that
the five largest US military contracts to major tech firms between 2019 and 2022
have contract ceilings of $53 billion.

And those are the ones we know of.

Due to the combination of military classification and corporate secrecy
– transparency
– let alone accountability
– is very hard to come by.

The use of #probabilistic techniques to determine who is worthy of death
– wherever they’re used
– is, to me, the most chilling example of the serious dangers of the current centralized AI industry ecosystem,
and of the very material risks of believing the bombastic claims of intelligence and accuracy that are used to market these inaccurate systems.

And to justify carnage under the banner of computational sophistication.

As UN Secretary-General António Guterres put it,
“machines that have the power and the discretion to take human lives are politically unacceptable,
are morally repugnant,
and should be banned by international law.”

It’s because of this that I join the
German Forum of Computer Scientists for Peace and Social Responsibility
in demanding that
“practices of targeted killing with supporting systems be outlawed as war crimes.”

Particularly given the very real possibility of a more authoritarian government in the US, where these companies are homed.

A place where the right wing in the country has already broadcast plans to
bring the two major tech regulators in the US
– the Federal Trade Commission and the Federal Communications Commission –
under direct presidential control in the future.

Where four of the top five social media platforms are housed, alongside cloud giants that currently control 70% of the global cloud market.

And where a federal abortion ban is on the right wing agenda,
accompanied by ongoing campaigns of book banning and censorship of LGBTQ resources and expression
already shaping legislation at the state level.

(6/8)

Finally, I managed to write another blog post. In the post I am essentially talking about works (mostly) by Alessandro Rudi and coauthors on #tractable #probabilistic models such as #mixtures with #negative parameters.
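As a toy illustration of the idea (my own sketch, not taken from the blog post or from Rudi and coauthors): a two-component Gaussian mixture can carry a negative weight and still be a valid density, as long as the weights sum to one and the positive component dominates the subtracted one pointwise.

```python
import numpy as np

def gauss_pdf(x, sigma):
    """Density of N(0, sigma^2) evaluated at x."""
    return np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# Mixture with weights (1.5, -0.5), which sum to 1.
# Nonnegativity holds because 0.5*N(0, 0.5^2) <= (2/3) * 1.5*N(0, 1) everywhere,
# so the subtracted component never overtakes the positive one.
x = np.linspace(-10.0, 10.0, 20001)
f = 1.5 * gauss_pdf(x, 1.0) - 0.5 * gauss_pdf(x, 0.5)

dx = x[1] - x[0]
print("min f(x):", f.min())        # stays positive: a valid density
print("integral:", f.sum() * dx)   # close to 1.0
```

The resulting density is sharper around zero than any nonnegative mixture of the same two Gaussians could be, which is one intuition for why negative parameters buy extra expressiveness.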

Check out: trappmartin.github.io/website/

I will likely go into the details of some aspects of this new family of models in follow-up posts.

Let me know what you think about the blog post. Too high level or just right? Or any other feedback?

Almost surely – Mixtures with Negative Weights: “It looks like I did not quite manage to stick to my ‘one post per month’ idea.”

@darioringach This preprint is an excellent read on the #adaptation of the #firing rate of primary #visual #cortex neurons - showing a #powerlaw scaling between the ratio of #probabilities and the ratio of observed firing rates, and also relative invariance of directional scatter (though I am puzzled by the use of cosine similarity instead of more standard distances between distributions...).

An interesting result is that the log firing rate is linearly proportional to the negative surprise of a stimulus, which will certainly bring a smile to proponents of #probabilistic representations in the brain or #predictive processing...
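Spelling out the link (my notation, not the preprint's): with surprise $S(s) = -\log p(s)$, a linear relation between log firing rate and negative surprise is exactly a power law in the probability ratios, which is the scaling described above:

```latex
\log r(s) = \alpha - \beta\, S(s) = \alpha + \beta \log p(s)
\quad\Longrightarrow\quad
\frac{r(s_1)}{r(s_2)} = \left(\frac{p(s_1)}{p(s_2)}\right)^{\beta}.
```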

The study extends these observations to more ecological distributions, but - as far as I understand it - still samples individual orientations. One can predict that this would extend to showing textures with different levels of anisotropy (as defined by the tuple $(n_x, n_y)$ in the Ringach, 2002 paper, for example), creating a stimulus set $\{ s_i \}$ that better covers the space of visual features.

The Sources Of Sea-Level Changes In The Mediterranean Sea Since 1960
--
doi.org/10.1029/2022JC019061 <-- shared paper
--
KEY POINTS:
• The average rate of sea-level rise in the Mediterranean Sea increased from −0.3 mm yr⁻¹ in the period 1960–1989 to 3.6 mm yr⁻¹ in 2000–2018
• Sterodynamic and land-mass changes both have made comparable contributions to the Mediterranean sea-level rise since 2000
• Since 2000, sea level has been rising faster in the Adriatic, Aegean, and Levantine Seas than anywhere else in the Mediterranean Sea…”
--
#GIS #spatial #mapping #Mediterranean #climatechange #SLR #sealevelrise #model #modeling #remotesensing #history #change #trends #satellite #altimetry #tidegauges #gauges #hydrography #hydrospatial #spatialanalysis #probabilistic #Bayesian #glacier #melting

Hello World!

I'm a Prof. of #ComputerScience at VRAIN/UPV (València, Spain), mainly working on (explainable, symbolic) artificial #intelligence #AI #XAI, (#probabilistic) #logic #programming, term #rewriting, #causality, #concurrency, programming #languages, #reversible computing, program #verification, and #debugging. 


I plan to use this account mostly for scientific matters, but not only. I'm also quite interested in #photography, #sciencefiction, #travel, #movies, #series, etc, etc.