Machine Learning/cs231n
-
[cs231n_review#Lecture 3-1] Loss Functions | Machine Learning/cs231n | 2023. 6. 18. 17:01
There exists some parameter matrix W which takes this long column vector representing the image pixels and converts it into 10 numbers: scores for each of the 10 classes, in the case of CIFAR-10. We had the interpretation that larger values of those scores are better, so a larger value for the cat class means the classifier thinks that the cat is more likely for that ima..
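The scores-to-loss step this post builds toward can be sketched with the multiclass SVM (hinge) loss covered in the lecture; the scores and labels below are toy numbers for illustration, not values from the post:

```python
import numpy as np

def svm_loss(scores, correct_class, margin=1.0):
    # Multiclass SVM (hinge) loss for one example:
    # sum over the wrong classes of max(0, s_j - s_correct + margin).
    margins = np.maximum(0, scores - scores[correct_class] + margin)
    margins[correct_class] = 0  # the correct class contributes nothing
    return margins.sum()

scores = np.array([3.2, 5.1, -1.7])  # toy scores for, say, cat / car / frog
print(round(svm_loss(scores, correct_class=0), 2))  # → 2.9
```

The loss is zero only when the correct class's score beats every other score by at least the margin, which is exactly the "larger score for the right class" interpretation made quantitative.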
-
[cs231n_review#Lecture 2-5] Linear Classifier from viewpoint of the template matching approach | Machine Learning/cs231n | 2023. 6. 17. 18:40
In linear classification, we take a somewhat different approach from k-nearest neighbor. The linear classifier is one of the simplest examples of what we call a parametric model. We usually write our input data as X, and we also have a set of parameters, or weights, usually called W (sometimes theta, depending on the literature). So, in the k-nearest neighbor setup there..
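The parametric form f(x, W) = Wx can be sketched with CIFAR-10 shapes, assuming a flattened 32x32x3 image (3072 numbers) and toy random weights rather than trained ones:

```python
import numpy as np

rng = np.random.default_rng(0)

# f(x, W) = Wx + b: W maps a 3072-dim image vector to 10 class scores.
W = rng.standard_normal((10, 3072)) * 0.01  # toy random weights, not trained
b = np.zeros(10)
x = rng.standard_normal(3072)               # stand-in for a flattened image

scores = W @ x + b
print(scores.shape)  # (10,) -- one score per class
```

Unlike k-nearest neighbor, once W is learned the training data can be thrown away; prediction is a single matrix-vector multiply.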
-
[cs231n_review#Lecture 2-4] Cross-Validation | Machine Learning/cs231n | 2023. 6. 16. 19:21
Idea #4: Cross-validation. Your algorithm is able to see the labels of the training set, but for the validation set it doesn't have direct access to the labels; we only use the labels of the validation set to check how well our algorithm is doing. (Cross-validation can alleviate the over-fitting problem.) Distances like Euclidean distance, or L1 distance, are really not a very g..
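The cross-validation idea can be sketched as a k-fold loop in which each fold serves as the validation set exactly once; the fold count and data size below are hypothetical:

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    # Shuffle indices 0..n-1, then split them into k roughly equal folds.
    idx = np.random.default_rng(seed).permutation(n)
    return np.array_split(idx, k)

# 5-fold skeleton: train on 4 folds, validate on the held-out one.
folds = kfold_indices(n=20, k=5)
for i, val_idx in enumerate(folds):
    train_idx = np.concatenate([f for j, f in enumerate(folds) if j != i])
    # Here you would train on train_idx and score hyperparameters on val_idx;
    # only the validation labels are ever used for checking performance.
    assert len(val_idx) == 4 and len(train_idx) == 16
```

Averaging the validation score across the k folds gives a less noisy estimate of how a hyperparameter choice will behave on unseen data.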
-
[cs231n_review#Lecture 2-3] Setting Hyperparameters | Machine Learning/cs231n | 2023. 6. 14. 17:36
Idea #1: If we use this strategy we'll always pick K=1. In practice, setting K to larger values might cause us to misclassify some of the training data but, in fact, lead to better performance on points that were not in the training data. We don't care about fitting the training data; we really care about how our classifier, or our method, will perform on unseen data afte..
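Choosing K on held-out data, as described above, can be sketched as a sweep over candidate K values scored on a validation split; the two-blob dataset and the candidate list below are made up for illustration:

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k):
    # Majority vote among the k nearest training points (L2 distance).
    d = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    nearest = np.argsort(d, axis=1)[:, :k]
    return np.array([np.bincount(v).argmax() for v in y_train[nearest]])

rng = np.random.default_rng(0)
# Two toy Gaussian blobs as classes 0 and 1 (hypothetical data).
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
shuf = rng.permutation(100)
X, y = X[shuf], y[shuf]
X_tr, y_tr, X_val, y_val = X[:70], y[:70], X[70:], y[70:]

# Score each candidate K on the validation set; keep the best.
accs = {k: (knn_predict(X_tr, y_tr, X_val, k) == y_val).mean() for k in (1, 3, 5, 7)}
best_k = max(accs, key=accs.get)
print(best_k, accs[best_k])
```

Note that K=1 always scores perfectly on the training set itself (every point is its own nearest neighbor), which is exactly why the selection has to happen on data the classifier has not seen.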
-
[cs231n_review#Lecture 2-2] Nearest Neighbor classifier | Machine Learning/cs231n | 2023. 6. 13. 11:30
You might imagine working on the dataset called CIFAR-10, which is very commonly used in machine learning as a kind of small test case. The CIFAR-10 dataset gives you 10 different classes; it provides 50,000 training images in total, roughly evenly distributed across these 10 categories, and then 10,000 additional testing images on which you're supposed to test your algo..
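The nearest-neighbor classifier this post introduces can be sketched as: memorize the training set, then label each test image with the label of its closest training image under L1 distance. The tiny 4-"pixel" arrays below stand in for flattened CIFAR-10 images:

```python
import numpy as np

class NearestNeighbor:
    def train(self, X, y):
        # "Training" just memorizes all the data.
        self.X_tr, self.y_tr = X, y

    def predict(self, X):
        # Label each test point with the label of the closest training
        # point under L1 distance (sum of absolute pixel differences).
        preds = np.empty(len(X), dtype=self.y_tr.dtype)
        for i, x in enumerate(X):
            dists = np.abs(self.X_tr - x).sum(axis=1)
            preds[i] = self.y_tr[np.argmin(dists)]
        return preds

# Tiny stand-in for flattened images (4 "pixels" instead of 3072).
X_train = np.array([[0, 0, 0, 0], [255, 255, 255, 255]], dtype=np.float64)
y_train = np.array([0, 1])
nn = NearestNeighbor()
nn.train(X_train, y_train)
print(nn.predict(np.array([[10, 5, 0, 20], [250, 240, 255, 230]])))  # [0 1]
```

Training is instant but prediction compares against every stored image, which is the wrong trade-off for deployment and part of the motivation for parametric classifiers in the next lecture.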
-
[cs231n_review#Lecture 2-1] Image Classification: A core task in Computer Vision | Machine Learning/cs231n | 2023. 6. 11. 17:36
* The Problem: Semantic Gap. The computer really is representing the image as this gigantic grid of numbers. The image might be something like 800 by 600 pixels, and each pixel is represented by three numbers giving the red, green, and blue values for that pixel. So, to the computer, this is just a gigantic grid of numbers. Out of this giant array of thousands of numbers, very m..
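The "gigantic grid of numbers" view can be made concrete with a small sketch, assuming the 800-by-600 RGB image from the excerpt stored as a NumPy array:

```python
import numpy as np

# An 800x600 RGB image, to the computer, is just a height x width x 3
# grid of integers in [0, 255] (one value per red/green/blue channel).
img = np.zeros((600, 800, 3), dtype=np.uint8)
img[100, 200] = [255, 128, 0]  # set one pixel's R, G, B values

print(img.shape)      # (600, 800, 3)
print(img.size)       # 1440000 numbers for this one image
print(img[100, 200])  # that pixel's [R G B] values
```

The semantic gap is exactly that nothing in these 1,440,000 raw numbers directly says "cat"; shifting the camera or changing the lighting rewrites most of them while the semantic content stays the same.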