CS224n learning part1

This is the introduction lecture on NLP (Natural Language Processing): an introduction to NLP and deep learning.

What’s deep learning?

Deep learning is a subfield of machine learning.

Most machine learning methods work well because of human-designed representations and input features; machine learning then becomes just optimizing weights to best make a final prediction.

Representation learning attempts to automatically learn good features or representations.

Deep learning algorithms attempt to learn (multiple levels of) representation and an output.

Deep NLP = Deep Learning + NLP:
combine the ideas and goals of NLP with representation learning and deep learning methods to solve them.

Several big improvements in NLP in recent years, across different

  • levels: speech, words, syntax, semantics.
  • tools: parts of speech, entities, parsing.
  • application: machine translation, sentiment analysis, dialogue agents, question answering.

Conclusion: one representation for all levels? Vectors.
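The idea that vectors can represent words at every level can be sketched with toy embeddings; the vectors and values below are made up for illustration and come from no real model, but they show how vector similarity captures relatedness:

```python
import math

# Toy word vectors (hypothetical 4-d embeddings, not from any real model).
vectors = {
    "king":  [0.9, 0.8, 0.1, 0.2],
    "queen": [0.9, 0.7, 0.9, 0.2],
    "apple": [0.1, 0.1, 0.2, 0.9],
}

def cosine(u, v):
    """Cosine similarity: close to 1.0 for vectors pointing the same way."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Semantically related words should be closer in vector space.
print(cosine(vectors["king"], vectors["queen"]))  # higher
print(cosine(vectors["king"], vectors["apple"]))  # lower
```

Real systems learn such vectors from data (e.g. word2vec-style training) rather than hand-writing them.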

INT201 W8

This week I am trying a different way of writing the blog, focusing mainly on the things I actually want to record.


Within the set of all languages, some are regular languages. We test whether a language is regular using DFAs, NFAs, and regular expressions; the pumping lemma shows that some languages are non-regular, and a non-regular language cannot be defined by a DFA, an NFA, or a regular expression. For those we need grammars, specifically context-free grammars, which can define both regular and non-regular languages.

Context-Free Grammar

G = (V, A, S, P)
V is a finite set of variables (non-terminals).
A is a finite alphabet of terminal symbols.
S ∈ V is the start symbol.
P is a finite set of production rules, e.g. x -> α where α ∈ (V ∪ A)* (similar to transition rules?).
All of these sets are finite.
Example rule: S -> aSb. "Context-free" means the rule can be applied to S regardless of the symbols surrounding it.
L(G) = {w | w ∈ A*, S ⇒* w}
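The example grammar S -> aSb (plus S -> ε to terminate) generates {aⁿbⁿ : n ≥ 0}, the classic non-regular language. A minimal sketch of derivation and membership checking (function names are mine):

```python
# Grammar: S -> aSb | ε, generating { a^n b^n : n >= 0 },
# the textbook example of a context-free but non-regular language.

def derive(n):
    """Apply S -> aSb n times, then S -> ε, yielding a^n b^n."""
    return "a" * n + "b" * n

def in_language(w):
    """Check membership in L(G) = { a^n b^n } directly."""
    n = len(w) // 2
    return w == "a" * n + "b" * n

print(derive(3))            # aaabbb
print(in_language("aabb"))  # True
print(in_language("abab"))  # False
```

No DFA can do this check, since recognizing aⁿbⁿ requires unbounded counting; that is exactly what the pumping lemma argument formalizes.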

CAN201 W9

  1. Routing (2) - Distance vector algorithm
  2. Intra-AS routing in the Internet: OSPF
  3. Routing among the ISPs: BGP
  4. The SDN control plane
  5. ICMP
  6. SNMP
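The distance-vector algorithm in item 1 can be sketched as iterated Bellman-Ford updates; the 3-node topology and link costs below are illustrative, not from the lecture:

```python
# A minimal sketch of the distance-vector (Bellman-Ford) update
# on a made-up 3-node topology; node names and costs are illustrative.

INF = float("inf")

# Direct link costs: cost[u][v] is the cost of edge u-v (INF if absent).
cost = {
    "x": {"x": 0, "y": 2, "z": 7},
    "y": {"x": 2, "y": 0, "z": 1},
    "z": {"x": 7, "y": 1, "z": 0},
}
nodes = list(cost)

# Each node starts with its own distance vector = direct link costs.
dv = {u: dict(cost[u]) for u in nodes}

# Iterate the Bellman-Ford equation: D_x(y) = min over v of c(x,v) + D_v(y).
changed = True
while changed:
    changed = False
    for x in nodes:
        for y in nodes:
            best = min(cost[x][v] + dv[v][y] for v in nodes)
            if best < dv[x][y]:
                dv[x][y] = best
                changed = True

print(dv["x"]["z"])  # 3: route x -> y -> z (2 + 1) beats the direct link (7)
```

In the real protocol each node only exchanges its vector with direct neighbors, so the same minimization happens in a distributed, asynchronous way.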

CPT203 W8

This week's content:

  • UML class diagrams
  • Sequence diagrams
  • State diagrams
  • Activity diagrams

The lecturer's slides are not great; I recommend studying this on your own.

CAN201 W8

Terms of packet in different layers

  • Application layer: message
  • Transport layer: segment
  • Network layer: datagram
  • Link layer: frame
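The four terms above correspond to layers of encapsulation: each layer wraps the unit from the layer above with its own header. A minimal sketch (all field names and addresses are illustrative):

```python
# Layered encapsulation: message -> segment -> datagram -> frame.
# All addresses and field names here are made up for illustration.

message = "GET /index.html"                     # application layer: message
segment = {"src_port": 51000, "dst_port": 80,   # transport layer: segment
           "payload": message}
datagram = {"src_ip": "10.0.0.1",               # network layer: datagram
            "dst_ip": "93.184.216.34",
            "payload": segment}
frame = {"src_mac": "aa:aa:aa:aa:aa:aa",        # link layer: frame
         "dst_mac": "bb:bb:bb:bb:bb:bb",
         "payload": datagram}

# Unwrapping the payloads recovers the original message layer by layer.
print(frame["payload"]["payload"]["payload"])  # GET /index.html
```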

TCP three-way handshake

  • A -> B: SYN (seq = x, ack = 0, L = 0)
  • B -> A: SYN+ACK (seq = y, ack = x+1, L = 0)
  • A -> B: ACK (seq = x+1, ack = y+1, L = 0)
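The sequence/ack arithmetic of the three steps can be sketched directly; x and y stand for the two sides' randomly chosen initial sequence numbers (the function name is mine):

```python
import random

def three_way_handshake(x=None, y=None):
    """Return the three handshake messages as (direction, flags, fields).

    x and y are the initial sequence numbers chosen by A and B;
    each side acknowledges the other's SYN with seq + 1.
    """
    x = random.randrange(2**32) if x is None else x  # A's initial seq
    y = random.randrange(2**32) if y is None else y  # B's initial seq
    return [
        ("A->B", "SYN",     {"seq": x,     "ack": 0}),
        ("B->A", "SYN+ACK", {"seq": y,     "ack": x + 1}),
        ("A->B", "ACK",     {"seq": x + 1, "ack": y + 1}),
    ]

for direction, flags, fields in three_way_handshake(x=100, y=300):
    print(direction, flags, fields)
```

The SYN consumes one sequence number even though it carries no data (L = 0), which is why the acks are x+1 and y+1 rather than x and y.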

Flow Control vs. Congestion Control

  • Host vs. Router: flow control keeps the sender from overwhelming the receiving host's buffer, while congestion control keeps senders from overwhelming the routers inside the network.

This week also continues the study of IP.


CAN201 W6

Lecture

Although these notes are late, they will definitely be caught up!
The first part of this week is TCP congestion control; I will write those notes when I have time. For now, let's move on to the second part: the network layer.


Network-layer services

  • Transport segments from the sending host to the receiving host
  • On the sending side, encapsulate segments into datagrams
  • On the receiving side, deliver the segments up to the transport layer
  • Network-layer protocols run in every single host and router
  • Routers examine the header fields of every IP datagram passing through them

CPT203 W6

This is the note on System Modeling.

  • Context models
  • Interaction models
  • Structural models
  • Behavioral models
  • Model-driven engineering

There is so much to take notes on this week...


CAN201 W5

This is the note of CAN201 Week5.
We will focus on Transport Layer.
Roadmap

  1. Pipelined communication
  2. TCP: connection-oriented transport
  3. Principles of congestion control
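Item 1's motivation for pipelining can be shown with a small utilization calculation; the packet size, link rate, and RTT below are textbook-style numbers chosen for illustration:

```python
# Why pipelining helps: sender utilization with and without it.
# The numbers (1000-byte packets, 1 Gbps link, 30 ms RTT) are illustrative.

L = 8000          # packet size in bits (1000 bytes)
R = 10**9         # link rate in bits per second (1 Gbps)
RTT = 0.030       # round-trip time in seconds

t_trans = L / R   # time to push one packet onto the link

# Stop-and-wait: one packet per (RTT + transmission time).
u_stop_and_wait = t_trans / (RTT + t_trans)

# Pipelining with a window of N packets multiplies utilization by N,
# until the window is large enough to fill the pipe (capped at 1.0).
N = 3
u_pipelined = min(1.0, N * t_trans / (RTT + t_trans))

print(f"stop-and-wait utilization: {u_stop_and_wait:.6f}")
print(f"pipelined (N={N}) utilization: {u_pipelined:.6f}")
```

With these numbers the stop-and-wait sender is idle almost the whole time, which is exactly why TCP sends a window of segments before waiting for acknowledgments.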

CPT203 W5

This is the note of CPT203 W5.
This week is mainly about Human Aspects and SE Principles.
It really felt like the main topics were how to handle interpersonal issues within a team and how to improve a team's efficiency.
