Spiking Neural Networks (SNN)
Dr. Jeff Clark from RedPoint AI will discuss Spiking Neural Networks (SNNs). SNNs have the potential for lower latency and greater computational efficiency. Spiking neural networks are biologically inspired models, considered a hybrid of neuroscience and machine learning, and this is an exciting area of machine learning.
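To give a flavor of what "spiking" means, below is a minimal sketch of a leaky integrate-and-fire neuron in NumPy. The `lif_neuron` helper, threshold, and decay values are illustrative assumptions for this post, not material from the talk.

```python
# Minimal sketch (assumptions, not the speaker's material) of a leaky
# integrate-and-fire neuron: the membrane potential integrates input current,
# leaks over time, and emits a binary spike when it crosses a threshold.
import numpy as np

def lif_neuron(input_current, threshold=1.0, decay=0.9, dt=1.0):
    """Simulate one leaky integrate-and-fire neuron over a 1-D input sequence."""
    v = 0.0
    spikes = []
    for i in input_current:
        v = decay * v + dt * i      # leaky integration of the input current
        if v >= threshold:          # fire a spike and reset the membrane
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return np.array(spikes)

# A constant input produces a regular spike train; sparse binary spikes are
# part of why SNN hardware can be low latency and energy efficient.
print(lif_neuron(np.full(20, 0.3)))
```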
Tentative talk: Bayesian Optimization for faster results: Bayesian optimization methods are more computationally efficient (i.e., faster) because they choose the next hyperparameters in an informed manner, in contrast to a grid search, random search, or haphazard approach. Evelyn Boettcher will go through an example of how Bayesian optimization works.
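As a preview of the idea, here is a minimal sketch of a Bayesian optimization loop over a single hyperparameter, using scikit-learn's Gaussian process regressor and an expected-improvement acquisition function. The `objective` stand-in, the search range, and the parameter values are illustrative assumptions, not the speaker's example.

```python
# Minimal sketch of Bayesian optimization: fit a Gaussian process surrogate to
# past evaluations, then pick the next hyperparameter by maximizing expected
# improvement instead of sampling on a grid or at random.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(lr):
    # Placeholder for "train a model with this learning rate, return validation loss".
    return np.sin(5 * lr) + 0.5 * lr

def expected_improvement(candidates, gp, best_y):
    mu, sigma = gp.predict(candidates, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    imp = best_y - mu                   # improvement over the best loss so far (minimizing)
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0, size=(3, 1))  # a few random starting points
y = np.array([objective(x[0]) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
candidates = np.linspace(0.0, 2.0, 200).reshape(-1, 1)

for _ in range(10):
    gp.fit(X, y)
    ei = expected_improvement(candidates, gp, y.min())
    x_next = candidates[np.argmax(ei)]  # informed choice of the next hyperparameter
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[0]))

print("best learning rate:", X[np.argmin(y)][0], "loss:", y.min())
```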
Future Topic ideas:
- MLFlow: A brief overview of what it is and why you should switch.
- Multi-Armed Bandits: Why and how you should use them to design your experiments.
- Data: Your algorithm is only as good as your data. DVC promises to help you keep track of what changes you made to your data, and when. Does it live up to the hype? Is it really git for your data?
- Training Techniques: Have you started using the classroom technique? What training techniques do you use?