Channel: Brain and Cognition Scientific Association, Shahid Beheshti University
📣📣📣 Deep Learning and Neural Networks Symposium and Workshop

⚙️ Organized by: Institute for Cognitive and Brain Sciences, Shahid Beheshti University and Loop Academy

👨🏻‍🎓 Speaker introduction

Professor Wulfram Gerstner,
Director of the Laboratory of Computational Neuroscience (LCN) at the EPFL

Title: Eligibility traces and three-factor rules of synaptic plasticity

Abstract: Hebbian plasticity combines two factors: presynaptic activity must occur together with some postsynaptic variable (spikes, voltage deflection, calcium elevation, ...). In three-factor learning rules, the combination of the two Hebbian factors is not sufficient on its own; instead it leaves a trace at the synapse (an eligibility trace) which decays over a few seconds. Only if a third factor (a neuromodulator signal) arrives, either simultaneously or within a short delay, is the actual change of the synapse via long-term plasticity triggered. After a review of classic theories and recent evidence for eligibility traces from plasticity experiments in rodents, I will discuss two studies from my own lab: the first is a modeling study of reward-based learning with spiking neurons using an actor-critic architecture; the second is a joint theory-experiment study showing evidence for eligibility traces in human behavior and pupillometry. Extensions from reward-based learning to surprise-based learning will be indicated.
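The gating logic of a three-factor rule can be sketched in a few lines of Python (an illustrative toy with made-up constants, not the models discussed in the talk):

```python
def three_factor_update(w, pre, post, e, neuromod, lr=0.01, decay=0.9):
    """One step of a three-factor plasticity rule (illustrative sketch).

    The Hebbian coincidence pre*post does not change the weight directly;
    it only refreshes an eligibility trace e, which decays over time.
    The weight changes only when a third factor (a neuromodulator signal)
    arrives while the trace is still nonzero.
    """
    e = decay * e + pre * post   # Hebbian coincidence feeds the trace
    w = w + lr * neuromod * e    # plasticity gated by the third factor
    return w, e

# Coincident pre/post activity alone leaves the weight unchanged...
w, e = three_factor_update(0.5, pre=1.0, post=1.0, e=0.0, neuromod=0.0)
# ...but a later neuromodulator pulse converts the stored trace into a change.
w, e = three_factor_update(w, pre=0.0, post=0.0, e=e, neuromod=1.0)
```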

⭕️ For more details please see here

Follow us!

@CMPlab
@LoopAcademy

🌐 www.CMPLab.ir
🌐 www.LoopAcademy.ir

👨🏻‍🎓 Speaker introduction

Professor James L. McClelland,
Co-Director, Center for Mind, Brain, Computation and Technology, Stanford University


Title: Human and Machine Learning: How each has taught us about the other, and what is left to learn

Abstract: In this talk, I will describe work at the interface between human and machine learning. The talk will draw on the effects of brain damage on human learning and memory, the patterns of learning that humans exhibit, and computational models based on artificial neural networks that reveal properties shared by human and artificial neural networks. In the latter part of the talk, we will discuss challenges posed to artificial learning systems by aspects of human learning we still do not fully understand in terms of the underlying neural computations.


👨🏻‍🎓 Speaker introduction

Professor Hugo Larochelle,
Google Brain

Title: Learning to generalize from few examples with meta-learning

Abstract: Much of the recent progress on many AI tasks was enabled in part by the availability of large quantities of labeled data for deep learning. Yet humans are able to learn concepts from as little as a handful of examples. Meta-learning has been a very promising framework for addressing the problem of generalizing from small amounts of data, known as few-shot learning. In this talk, I'll present an overview of the recent research that has made exciting progress on this topic. I will also share my thoughts on the challenges and research opportunities that remain in few-shot learning, including a proposal for a new benchmark.
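One concrete instance of this framework is the prototypical-network idea: classify a query by its distance to the mean embedding of each class's few support examples. A minimal sketch, assuming the embeddings come from some meta-trained encoder (not shown) and using toy data:

```python
import numpy as np

def prototype_classify(support_x, support_y, query_x):
    """Nearest-class-mean few-shot classification, prototypical-network style.

    support_x: (n, d) embedded support examples (a handful per class)
    support_y: (n,) class labels; query_x: (m, d) embedded queries.
    """
    classes = np.unique(support_y)
    # one prototype per class: the mean of its support embeddings
    protos = np.stack([support_x[support_y == c].mean(axis=0) for c in classes])
    # squared Euclidean distance from every query to every prototype
    dists = ((query_x[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return classes[dists.argmin(axis=1)]

# Two classes, two support examples each, two queries.
support_x = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.0, 5.1]])
support_y = np.array([0, 0, 1, 1])
pred = prototype_classify(support_x, support_y, np.array([[0.2, 0.1], [4.9, 5.0]]))
```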

📣📣📣 Dormitory announcement for participants in the Deep Learning and Neural Networks Workshop and Symposium

For the convenience of students residing in other cities, and in coordination with Shahid Beheshti University, a limited number of dormitory places have been arranged for students registered for the Deep Learning and Neural Networks Workshop and Symposium. Students who wish to stay in the dormitory should, before registering, call 09195849138 (Amir Hosein Hadian) or message the Telegram ID @AmirHoseinHadian, and register on the system only after the dormitory arrangements are confirmed.

❗️Dormitory priority is given to students who apply and register earlier.

🏢 Organizers: Institute for Cognitive and Brain Sciences, Shahid Beheshti University, and Loop Academy

🕒 Workshop dates: Saturday 3 Esfand to Monday 5 Esfand

🕒 Symposium dates: Tuesday 6 Esfand to Thursday 8 Esfand

🏫 Venue: Auditorium of the Faculty of Electrical Engineering, Shahid Beheshti University


⭕️ For more details and registration, click this link.


👨🏻‍🎓 Speaker introduction

Dr. Timothée Masquelier,
CNRS Researcher (CR1) in Computational Neuroscience

Title: Supervised learning in spiking neural networks

Abstract: I will present two recent works on supervised learning in spiking neural networks.
In the first one, we used backpropagation through time. The most commonly used spiking neuron model, the leaky integrate-and-fire neuron, obeys a differential equation which can be approximated using discrete time steps, leading to a recurrent relation for the potential. The firing threshold causes optimization issues, but they can be overcome using a surrogate gradient. We extended previous approaches in two ways. Firstly, we showed that the approach can be used to train convolutional layers. Secondly, we included fast horizontal connections à la Denève: when a neuron N fires, we subtract from the potentials of all the neurons with the same receptive field the dot product between their weight vectors and that of neuron N. Such connections improved the performance.
The second project focuses on SNNs which use at most one spike per neuron per stimulus, together with latency coding. We derived a new learning rule for this sort of network, termed S4NN, akin to traditional error backpropagation yet based on latencies. We show how approximate error gradients can be computed backward in a feedforward network with any number of layers.
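The discrete-time LIF recurrence from the first project can be sketched as follows (a minimal forward pass with made-up constants; the surrogate gradient only matters during backpropagation and is omitted here):

```python
import math

def lif_forward(inputs, tau=10.0, dt=1.0, v_th=1.0):
    """Discrete-time leaky integrate-and-fire neuron (illustrative sketch).

    v[t] = alpha * v[t-1] + I[t]; crossing the threshold emits a spike,
    and the potential is reset by subtracting the threshold. In training,
    the non-differentiable step function is replaced by a surrogate gradient.
    """
    alpha = math.exp(-dt / tau)   # leak factor per time step
    v, spikes = 0.0, []
    for i in inputs:
        v = alpha * v + i
        s = float(v >= v_th)      # Heaviside step (surrogate grad in backprop)
        v -= s * v_th             # soft reset by subtraction
        spikes.append(s)
    return spikes

# Constant sub-threshold input: the potential integrates until it spikes.
spikes = lif_forward([0.6, 0.6, 0.6])
```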



👨🏻‍🎓 Speaker introduction

Dr. Ali Yoonessi,
Tehran University of Medical Sciences

Title: What can visual system neural networks tell us about better agent-based deep learning models? A prospect.

Abstract: Ample evidence suggests that the visual system is optimized to process the environment that we live in. Interactions among several types of neurons during development create a sophisticated neural network. What properties of these biological neural cells, or agents, can we use to create new models of agent-based neural networks?



👨🏻‍🎓 Speaker introduction

Dr. Mir-Shahram Safari,
Shahid Beheshti University of Medical Sciences (SBMU)

Title: Neurobiology and Neurophysiology of Neural Networks

Abstract: Neural networks in the brain are made of different cell types with different morphologies, molecular profiles, and electrophysiological properties, connected together with precise targeting biases. Synaptic connections between specific cell types have specific structural and functional features that distinguish them. Learning mechanisms in the brain follow the architecture of neural microcircuits and their synaptic features. Inhibitory control by interneurons over different dendritic compartments plays an important role in information processing, synaptic plasticity, and learning in neural microcircuits. Different organizations of interneurons in neural motifs provide the required control, for example through feedback, feedforward, or lateral inhibition. How different brain microcircuits are involved in information processing, learning, and memory is a very important open question in neuroscience. I will review the latest updates on this issue in my talk.



👨🏻‍🎓 Speaker introduction

Dr. Mohammad Ganjtabesh,
University of Tehran

Title: Bio-inspired Learning of Visual Features in Shallow and Deep Spiking Neural Networks

Abstract: To date, various computational models have been proposed to mimic the hierarchical processing of the ventral visual pathway in the cortex, with limited success. In this talk, we show how combining a biologically inspired network architecture with a biologically inspired learning rule significantly improves the models' performance on challenging invariant object recognition problems. In all experiments, we used a feedforward convolutional SNN and a temporal coding scheme in which the most strongly activated neurons fire first, while less activated ones fire later, or not at all. We start with a shallow network, in which neurons in the higher trainable layer are equipped with the STDP learning rule and progressively become selective to intermediate-complexity visual features appropriate for object recognition. Then, a deeper model comprising several convolutional (trainable with STDP) and pooling layers will be presented, in which the complexity of the extracted features increases along the hierarchy, from edge detectors in the first layer to object prototypes in the last layer. Finally, we show how reinforcement learning can be used efficiently to train a deep SNN to perform object recognition in natural images without using any external classifier, and the superiority of reward-modulated STDP (R-STDP) over STDP in extracting discriminative visual features will be discussed.
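A simplified additive STDP update of the kind used in this line of work can be sketched as follows (illustrative constants; in the rank-order coding scheme only the sign of the spike-time difference matters):

```python
def stdp(delta_t, w, a_plus=0.01, a_minus=0.012, w_min=0.0, w_max=1.0):
    """Simplified additive STDP update (illustrative sketch).

    delta_t = t_post - t_pre: pre-before-post (delta_t > 0) potentiates
    the synapse (LTP), post-before-pre depresses it (LTD). Weights are
    clipped to [w_min, w_max].
    """
    w = w + a_plus if delta_t > 0 else w - a_minus
    return min(max(w, w_min), w_max)

w_ltp = stdp(+5.0, 0.5)   # causal pairing strengthens the synapse
w_ltd = stdp(-5.0, 0.5)   # anti-causal pairing weakens it
```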


👨🏻‍🎓 Speaker introduction

Dr. Milad Mozafari,
CerCo, CNRS, France

Title: Reconstructing Natural Scenes from fMRI Patterns using Bi-directional Generative Neural Networks

Abstract: Decoding and reconstructing images from brain imaging data is a research area of high interest. Recent progress in deep generative neural networks has introduced new opportunities to tackle this problem. Here, we employ a recently proposed large-scale bi-directional generative adversarial network, called BigBiGAN, to decode and reconstruct natural scenes from fMRI patterns. BigBiGAN converts images into a 120-dimensional latent space which encodes class and attribute information together, and can also reconstruct images based on their latent vectors. We trained a linear mapping between fMRI data, acquired over images from 150 different categories of ImageNet, and their corresponding BigBiGAN latent vectors. Then, we applied this mapping to the fMRI activity patterns obtained from 50 new test images from 50 unseen categories in order to retrieve their latent vectors, and reconstruct the corresponding images. Pairwise image decoding from the predicted latent vectors was highly accurate (84%). Moreover, qualitative and quantitative assessments revealed that the resulting image reconstructions were visually plausible, successfully captured many attributes of the original images, and had high perceptual similarity with the original content. This method establishes a new state-of-the-art for fMRI-based natural image reconstruction, and can be flexibly updated to take into account any future improvements in generative models of natural scene images.
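The core of such a decoding pipeline, a regularized linear map from fMRI patterns to latent vectors, can be sketched like this (random stand-in data and a plain ridge solution; the authors' exact preprocessing and solver are not specified here):

```python
import numpy as np

def fit_linear_mapping(X, Z, alpha=1.0):
    """Ridge regression: W minimizing ||XW - Z||^2 + alpha * ||W||^2.

    X: (n_stimuli, n_voxels) fMRI patterns; Z: (n_stimuli, d) latent vectors.
    """
    n_vox = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_vox), X.T @ Z)

rng = np.random.default_rng(0)
X_train = rng.normal(size=(150, 40))    # 150 training stimuli (toy voxel count)
Z_train = rng.normal(size=(150, 120))   # 120-dim latent vectors, as in BigBiGAN
W = fit_linear_mapping(X_train, Z_train)

X_test = rng.normal(size=(50, 40))      # 50 held-out stimuli
Z_pred = X_test @ W                     # predicted latents -> generator -> images
```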


👨🏻‍🎓 Speaker introduction

Dr. Mohammad Rostami,
University of Pennsylvania

Title: Learning to Transfer Knowledge Through Embedding Spaces

Abstract: The unprecedented processing demand posed by the explosion of big data challenges researchers to design efficient and adaptive machine learning algorithms that do not require persistent retraining and avoid learning redundant information. Inspired by the learning techniques of intelligent biological agents, identifying transferable knowledge across learning problems has been a significant research focus for improving machine learning algorithms. In this talk, we address the challenges of knowledge transfer through embedding spaces that capture and store hierarchical knowledge.

In the first part of the talk, we focus on the problem of cross-domain knowledge transfer. We first address zero-shot image classification, where the goal is to identify images from unseen classes using semantic descriptions of these classes. We train two coupled dictionaries which align the visual and semantic domains via an intermediate embedding space. We then extend this idea by training deep networks that match the data distributions of two visual domains in a shared cross-domain embedding space. Our approach addresses both the semi-supervised and unsupervised domain adaptation settings.

In the second part of the talk, we investigate the problem of cross-task knowledge transfer. Here, the goal is to identify relations and similarities between multiple machine learning tasks to improve performance across the tasks. We first address the problem of zero-shot learning in a lifelong machine learning setting, where the goal is to learn tasks with no data using high-level task descriptions. Our idea is to relate high-level task descriptors to the optimal task parameters through an embedding space. We then develop a method to overcome the problem of catastrophic forgetting in the continual learning setting of deep neural networks by enforcing the tasks to share the same distribution in the embedding space. We further demonstrate that our model can address the challenges of domain adaptation in the continual learning setting.

We demonstrate that despite major differences, problems within the above learning scenarios can be tackled through learning an intermediate embedding space.


👨🏻‍🎓 Speaker introduction

Dr. Reza Ghaderi,
Shahid Beheshti University

Title: Solving cognitive science problems with neural networks

Abstract: This talk gives a brief history of an important part of mathematical problem-solving techniques, the branch concerned with representation, and then describes it in the context of neural networks. By introducing the different stages of problems in cognitive science, the use of neural networks for their representation will be described. This description is based on a systematic view of two main parts, namely representation and optimization. It is hoped that this talk, without going into details, will enable listeners to use neural network tools in research and problem-solving.



👨🏻‍🎓 Speaker introduction

Prof. Reza Ebrahimpour,
Shahid Rajaee University

Title: Speed-accuracy tradeoff explains the neural mechanism of social decision making

Abstract: Confidence can play a vital role in group decision making: a member's confidence has a major impact on the final decision of the group. The neural mechanisms of confidence formation and decision making in isolated settings have been studied extensively in past decades, and computational models have successfully explained how confidence forms and how it relates to other behavioral statistics such as accuracy and reaction time. Yet these questions remain unanswered for social decision making. Using a multidisciplinary approach, we studied the speed-accuracy tradeoff regime in social decision making to address this gap. Subjects were required to decide the motion direction of random dots while paired with computer-generated partners. Although, in social decision making, subjects showed increased confidence and decreased reaction time, their accuracy remained unchanged, a phenomenon that is hard to explain with computational models of isolated decision making. Using a modified neural attractor network, we found that the partner's confidence could act as a top-down current from prefrontal cortex toward the decision-making area of the brain (centro-parietal). The model could explain not only the speed-accuracy tradeoff but also the variation of confidence observed in the behavioral data. EEG and eye-tracking data also support our computational model: both suggest that confidence coding is altered in social situations in the way our model predicts. The findings of this study could enhance our understanding of confidence formation in the social decision-making context.


👨🏻‍🎓 Speaker introduction

Dr. Soheil Kolouri,
Research Scientist and Principal Investigator at HRL Laboratories, Malibu, California

Title: Deep Generative Modeling via Wasserstein Distances

Abstract: Deep generative models have become a cornerstone of modern machine learning. The celebrated generative adversarial networks (GANs) have notably contributed to the recent success of these models. However, GANs are also known to be notoriously difficult to optimize, and they are often not stable. Probability metrics, on the other hand, have proven themselves a reliable alternative to adversarial networks, and provide a better geometric understanding of the problem. In this talk, I will focus on Wasserstein distances, which emerge from the optimal transportation problem, discuss their limitations, and introduce Generalized Sliced Wasserstein (GSW) distances as a remedy to alleviate some of these limitations. I will then review various applications of the GSW in deep generative modeling and transfer learning.
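The basic sliced-Wasserstein idea, projecting to 1-D where optimal transport reduces to sorting, can be sketched as follows (a Monte-Carlo sketch of the plain sliced variant, not the generalized version discussed in the talk):

```python
import numpy as np

def sliced_wasserstein(X, Y, n_proj=200, seed=0):
    """Monte-Carlo sliced 2-Wasserstein distance between point clouds X, Y (n, d).

    Each random unit direction gives a 1-D transport problem, solved
    exactly by matching sorted projections; we average over directions.
    """
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=(n_proj, X.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)  # unit directions
    xs = np.sort(X @ theta.T, axis=0)   # sorted 1-D projections of X
    ys = np.sort(Y @ theta.T, axis=0)   # sorted 1-D projections of Y
    return float(np.sqrt(np.mean((xs - ys) ** 2)))

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
d_same = sliced_wasserstein(X, X)         # identical clouds: distance is zero
d_shift = sliced_wasserstein(X, X + 5.0)  # translated cloud: clearly positive
```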

🔔 The Student Scientific Association of the Institute for Cognitive and Brain Sciences, Shahid Beheshti University, presents:

🔴 Online workshop: Designing Cognitive Tasks

🔷 Instructor: Paria Abravani, PhD researcher in cognitive psychology, ICBS

📅 Date: Thursday 6 Shahrivar

🕓 Time: 16:00 to 20:30

☎️ To register or for more information, contact the ID below
@Rad_delaram
🔔 The Student Scientific Association of the Institute for Cognitive and Brain Sciences, Shahid Beheshti University, presents:

🔴 Online workshop: Behavioral Data Analysis in Python

📅 Date: Monday 31 Shahrivar

🕓 Time: 17:00 to 20:00

☎️ To register or for more information, contact the ID below
@Rad_delaram
📣 The Mathematical and Computational Psychology Lab of the Institute for Cognitive and Brain Sciences, Shahid Beheshti University, in collaboration with Loop Academy, presents:

A 6-day school on mathematical psychology:
📆 Dates: Saturday 2 Esfand to Thursday 7 Esfand
📆 Deadline for submitting CV and recommendation letter: Wednesday 1 Bahman

📃 A certificate will be issued by the Institute for Cognitive and Brain Sciences, Shahid Beheshti University, and Loop Academy.

⭕️ For more details and registration, click this link.

Follow us!
@CMPlab
@LoopAcademy

🌐 www.CMPLab.sbu.ac.ir
🌐 www.LoopAcademy.io
📣 The Mathematical and Computational Psychology Lab of the Institute for Cognitive and Brain Sciences, Shahid Beheshti University, in collaboration with Loop Academy, presents:

🧠 Workshop: Cognitive Modeling of Human Behavior

📅 Date: Wednesday 5 Aban, 9:00 to 17:00

🖥 This course will be held online.

⭕️ For more details and registration, click this link.

Capacity for this course is limited.

📃 A certificate will be issued by the Institute for Cognitive and Brain Sciences, Shahid Beheshti University, and Loop Academy.


The Cognitive Science Scientific Association of Shahid Beheshti University invites all interested students to join and collaborate in the following areas:
🧠 Content creation
🧠 Design
🧠 Article writing
🧠 Topical discussions
🧠 Education and research
🧠 Journal club
and other cognitive-science events.
📝 To register as a member, enter your information at the link below:
https://forms.gle/MzhnLUnJsZ8fy4mw6


Registration window: 4 to 7 Shahrivar