greatConvensions
A Programming/Research Blog by Hiroyuki Vincent Yamazaki

A Database for Neural Networks

All this Neural Network Terminology…

In the scientific field of machine learning and neural networks, methods are introduced with fancy names and abbreviations all the time, at least at the time of writing this post. There is a terminology you need to become familiar with in order to fully understand the papers, assuming the papers are sound and that you will read them more than once [1]. The same goes for articles posted on the Internet. In papers, references can be followed to understand these concepts, and a good starting point is to read up on the work by Geoffrey Hinton, Yann LeCun and Andrew Ng, which will familiarize you with some of the terminology.

…in a Repo

However, these references are not always provided in articles on the Internet. I have created a repository on GitHub that lists some of the most commonly recurring terms with short explanations. Useful links and references are also included. You will most likely not read all of them, but I hope it can aid you in your studies!

Link to the GitHub repository

References

[1] Reading Scientific Papers http://web.stanford.edu/~siegelr/readingsci.htm

Started a Blog

This blog will focus on machine learning to begin with, but will in the future cover anything related to coding. Keywords such as Deep Neural Networks (DNN), Convolutional Neural Networks (CNN) and Regions with Convolutional Neural Networks (R-CNN) will most likely appear in the initial posts. I also hope to introduce some papers from arXiv, an online repository of scientific research papers.

More Machine Learning?

Yes, there are in fact many tutorials and posts about machine learning on the Internet. The aim of this blog, however, is to act as a complement to that material. For instance, it will focus on areas that others usually skip, such as describing convolution with worked mathematical examples in the context of CNNs, or visualizing networks in an unconventional manner to help you remember their internal structure more effectively. A small preview of the kind of convolution example I have in mind is sketched below.
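To make that concrete, here is a minimal sketch of the "valid" convolution a CNN layer computes, written in Python with NumPy. The function name conv2d and the tiny input and kernel are chosen purely for illustration; most deep learning libraries actually compute cross-correlation (no kernel flip), and the sketch follows that convention.

    import numpy as np

    def conv2d(image, kernel):
        # "Valid" convolution as used in CNN layers: no padding, stride 1.
        # Like most deep learning libraries, this is really cross-correlation,
        # i.e. the kernel is not flipped.
        kh, kw = kernel.shape
        oh = image.shape[0] - kh + 1
        ow = image.shape[1] - kw + 1
        out = np.zeros((oh, ow))
        for i in range(oh):
            for j in range(ow):
                # Element-wise product of the kernel with the current window,
                # summed into a single output value.
                out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
        return out

    image = np.array([[1, 2, 3],
                      [4, 5, 6],
                      [7, 8, 9]], dtype=float)
    kernel = np.array([[1, 0],
                       [0, -1]], dtype=float)

    print(conv2d(image, kernel))
    # [[-4. -4.]
    #  [-4. -4.]]

Each output value is obtained by sliding the 2x2 kernel over the 3x3 input: for the top-left window, 1*1 + 2*0 + 4*0 + 5*(-1) = -4, and the same pattern repeats for the other three windows.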

Enough talking, let's see how it goes (talking solely to myself).