med-mastodon.com is one of the many independent Mastodon servers you can use to participate in the fediverse.
Medical community on Mastodon

#feedforward

#Zoomposium with Dr. #Patrick #Krauß: Building instructions for #artificial #consciousness

Transferring the various stages of #Damasio's #theory of consciousness 1:1 into concrete #schematics for #deeplearning. To this end, strategies such as #feedforward connections, #recurrent #connections in the form of #reinforcement learning and #unsupervised #learning are used to simulate the #biological #processes of the #neuronal #networks.

More at: philosophies.de/index.php/2023

or: youtu.be/rXamzyoggCo
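To make the distinction in the post concrete, here is a minimal numpy sketch (my own illustration, not material from the interview): a feedforward layer maps input to output in one direction only, while a recurrent layer feeds its own previous state back in, which gives the network a form of memory.

```python
import numpy as np

rng = np.random.default_rng(0)

W_ff = rng.normal(size=(4, 3))    # feedforward weights: 3 inputs -> 4 units
W_in = rng.normal(size=(4, 3))    # recurrent layer: input weights
W_rec = rng.normal(size=(4, 4))   # recurrent layer: state-to-state weights

def feedforward(x):
    """Output depends only on the current input."""
    return np.tanh(W_ff @ x)

def recurrent(xs):
    """Output at each step depends on the input *and* the previous state."""
    h = np.zeros(4)
    states = []
    for x in xs:
        h = np.tanh(W_in @ x + W_rec @ h)
        states.append(h)
    return states

x = rng.normal(size=3)
y = feedforward(x)                # same x always gives the same y
states = recurrent([x, x, x])     # same x, but the fed-back state evolves
```

Feeding the identical input three times leaves the feedforward output unchanged, while the recurrent states keep drifting because each step sees the previous hidden state.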

Addendum 10

One Wide Feedforward is All You Need
arxiv.org/abs/2309.01826

* 2 non-embedding components in the transformer architecture: attention; the feed-forward network (FFN)
* attention captures interdependencies between words regardless of position
* the FFN non-linearly transforms each input token independently
* the FFN (a significant fraction of the parameters) is highly redundant
* only a modest drop in accuracy from removing the FFN on the decoder layers & sharing a single FFN across the encoder
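The parameter saving behind those bullet points can be sketched in a few lines of numpy (my own illustration, not the paper's code; all names are made up): instead of giving each encoder layer its own position-wise FFN, one FFN object is reused at every layer.

```python
import numpy as np

d_model, d_ff, n_layers = 16, 64, 6
rng = np.random.default_rng(1)

def make_ffn():
    # One position-wise feed-forward block: two linear maps with a ReLU.
    return {"W1": rng.normal(size=(d_ff, d_model)),
            "W2": rng.normal(size=(d_model, d_ff))}

def apply_ffn(p, x):
    # Applied to each token independently; x has shape (tokens, d_model).
    return np.maximum(p["W1"] @ x.T, 0).T @ p["W2"].T

def n_params(p):
    return sum(w.size for w in p.values())

# Standard encoder: one FFN per layer.
per_layer = [make_ffn() for _ in range(n_layers)]
# "One Wide Feedforward" variant: a single FFN shared by every layer.
shared = make_ffn()

x = rng.normal(size=(5, d_model))   # 5 tokens
for _ in range(n_layers):
    x = apply_ffn(shared, x)        # the same FFN is reused at every layer
print(x.shape)                      # (5, 16)
```

With sharing, the FFN parameter count drops by a factor of `n_layers` (here 6x), which is the redundancy the paper's ablations exploit; attention is omitted here since only the FFN is at issue.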

Absorbing Phase Transitions in Artificial Deep Neural Networks
arxiv.org/abs/2307.02284

To summarize, we believe that this work places the order-to-chaos transition in the initialized artificial deep neural networks in the broader context of absorbing phase transitions, & serves as the first step toward the systematic comparison between natural/biological & artificial neural networks.
...

arXiv.org: Absorbing Phase Transitions in Artificial Deep Neural Networks
Theoretical understanding of the behavior of infinitely-wide neural networks has been rapidly developed for various architectures due to the celebrated mean-field theory. However, there is a lack of a clear, intuitive framework for extending our understanding to finite networks that are of more practical and realistic importance. In the present contribution, we demonstrate that the behavior of properly initialized neural networks can be understood in terms of universal critical phenomena in absorbing phase transitions. More specifically, we study the order-to-chaos transition in the fully-connected feedforward neural networks and the convolutional ones to show that (i) there is a well-defined transition from the ordered state to the chaotic state even for the finite networks, and (ii) difference in architecture is reflected in that of the universality class of the transition. Remarkably, the finite-size scaling can also be successfully applied, indicating that intuitive phenomenological argument could lead us to semi-quantitative description of the signal propagation dynamics.
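The order-to-chaos transition the abstract describes can be seen in a back-of-the-envelope simulation (my own sketch, not the paper's code; the transition point assumes the standard mean-field picture for bias-free tanh networks): propagate two nearby inputs through a deep random tanh network and watch their distance shrink (ordered phase, small weight variance) or persist (chaotic phase, large weight variance).

```python
import numpy as np

def final_distance(sigma_w, width=200, depth=50, seed=0):
    """Distance between two initially close inputs after `depth` random
    tanh layers with weight std sigma_w / sqrt(width)."""
    rng = np.random.default_rng(seed)
    x1 = rng.normal(size=width)
    x2 = x1 + 1e-3 * rng.normal(size=width)   # tiny perturbation
    for _ in range(depth):
        W = rng.normal(scale=sigma_w / np.sqrt(width), size=(width, width))
        x1, x2 = np.tanh(W @ x1), np.tanh(W @ x2)
    return np.linalg.norm(x1 - x2)

d_ordered = final_distance(sigma_w=0.5)   # below the transition: contracts
d_chaotic = final_distance(sigma_w=2.5)   # above the transition: persists
print(d_ordered < d_chaotic)
```

In the ordered phase the perturbation is damped toward zero layer by layer; in the chaotic phase it is amplified until the nonlinearity saturates, which is the finite-width analogue of the transition studied in the paper.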

Colleagues get surprised when I say I enjoy “marking” and I always think that is because their focus is wrong; I see it as “giving feedback and feedforward” rather than just marking or grading. This blog has many excellent tips on providing effective #feedback and #feedforward - despite the clunky name. I think seeing the process for what it really is could help lecturers get more out of it. #SoTL

facultyfocus.com/articles/educ

Faculty Focus | Higher Ed Teaching & Learning: Ten Tips for More Efficient and Effective Grading
Effective grading does not have to take inordinate amounts of time, nor do faculty need to sacrifice quality for timely grading.