Study Notes in Computer Science

Following this post (https://zhuanlan.zhihu.com/p/24774857), this thread (https://www.1point3acres.com/bbs/thread-161015-1-1.html), and my senior's suggestions, I began to learn computer science on my own. I am a master's student in the School of Materials Science at JAIST, so I can freely attend lectures from the School of Information Science; the rest of the time I study courses online. For study resources, Google and GitHub are good places to search.

On this page, I just keep my study notes.

To be updated later.

E-books: http://it-ebooks.flygon.net/

You can also use Library Genesis (Gen Lib) to get some e-books!

Computer Networks

Lectures: I226E, JAIST // Book: Computer Networking: A Top-Down Approach

I am using these resources:

Machine Learning

Lectures: I239E, JAIST & Machine Learning course, National Taiwan University & YouTube // Books: Machine Learning & Deep Learning

I got some resources from here: https://github.com/allmachinelearning/MachineLearning

I am using these resources:

Supervised Learning

Top-down training of Decision Trees

Recursive Partitioning

  • Find "best" test to install at root
  • Split data on root test
  • Repeat until:

  • All nodes are classes

  • No more features to test
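A minimal Python sketch of this recursive partitioning procedure (my own illustration, not from the lecture; choose_best_attribute is a hypothetical placeholder, and the tiny dataset is made up):

```python
from collections import Counter

def choose_best_attribute(rows, attributes):
    """Hypothetical placeholder: pick the attribute whose split looks "best".
    Here it simply returns the first attribute; a real learner would score
    candidates, e.g. by the information gain defined later in these notes."""
    return attributes[0]

def build_tree(rows, attributes):
    """Top-down recursive partitioning.
    rows: list of (feature_dict, label) pairs."""
    labels = [label for _, label in rows]
    # Stop when all examples share one class or no features remain to test.
    if len(set(labels)) == 1 or not attributes:
        return Counter(labels).most_common(1)[0][0]  # leaf: majority class
    attr = choose_best_attribute(rows, attributes)
    tree = {attr: {}}
    # Split the data on the chosen test and recurse on each subset.
    for value in {features[attr] for features, _ in rows}:
        subset = [(f, l) for f, l in rows if f[attr] == value]
        remaining = [a for a in attributes if a != attr]
        tree[attr][value] = build_tree(subset, remaining)
    return tree

# Tiny usage example with a made-up dataset.
data = [({"outlook": "sunny", "windy": "no"}, "play"),
        ({"outlook": "sunny", "windy": "yes"}, "stay"),
        ({"outlook": "rain", "windy": "no"}, "play")]
print(build_tree(data, ["outlook", "windy"]))
```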

Average Entropy

Self-information (the amount of information carried by an event x)

I(x) = -\log_2 P(x)

Entropy: the expected value of the self-information

H = -\sum_x P(x)\log_2 P(x)

Average entropy (the weighted entropy of the subsets after a split)

H' = \sum_v P(v)\left\{-\sum_x P(x\mid v)\log_2 P(x\mid v)\right\}
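As a quick numeric check of the entropy formula (my own illustrative example): for a dataset with 9 positive and 5 negative examples,

H = -\frac{9}{14}\log_2\frac{9}{14} - \frac{5}{14}\log_2\frac{5}{14} \approx 0.940 \text{ bits}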

From these definitions we have:

  • H(S) : the entropy of the dataset S
  • H(S|A) : the entropy of S after it is split by attribute A (the average entropy of the resulting subsets)

Hence, we can calculate the information gain IG(S, A), which measures the decrease in entropy from before to after the set S is split on attribute A:

IG(S,A) = H(S) - \sum_{t\in T} p(t)H(t) = H(S) - H(S|A)

where T is the set of subsets created by splitting S on A, and p(t) is the fraction of examples of S that fall into subset t.
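To make the calculation concrete, here is a small Python sketch (my own illustration, with a made-up dataset) that computes H(S), H(S|A), and IG(S, A):

```python
import math
from collections import Counter

def entropy(labels):
    """H = -sum P(x) log2 P(x) over the class labels."""
    total = len(labels)
    return -sum((count / total) * math.log2(count / total)
                for count in Counter(labels).values())

def information_gain(rows, attr):
    """IG(S, A) = H(S) - H(S|A), where rows are (feature_dict, label) pairs."""
    labels = [label for _, label in rows]
    # H(S|A): average entropy of the subsets created by attr, weighted by p(t).
    remainder = 0.0
    for value in {features[attr] for features, _ in rows}:
        subset = [label for features, label in rows if features[attr] == value]
        remainder += (len(subset) / len(rows)) * entropy(subset)
    return entropy(labels) - remainder

# Tiny made-up example: does splitting on "outlook" reduce entropy?
data = [({"outlook": "sunny"}, "stay"),
        ({"outlook": "sunny"}, "stay"),
        ({"outlook": "rain"}, "play"),
        ({"outlook": "rain"}, "play")]
print(information_gain(data, "outlook"))  # 1.0: the split is perfectly informative
```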

Computation Theory

Lectures: I238E, JAIST // Books: Introduction to Theoretical Computer Science

I am using these resources: