d = \max_k(\mid x_{1k}-x_{2k} \mid)
$$

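The Chebyshev distance above takes the largest absolute coordinate difference. A minimal sketch in plain Python; the sample points `x1` and `x2` are hypothetical, chosen only for illustration:

```python
def chebyshev(x1, x2):
    # Chebyshev distance: d = max_k |x1k - x2k|
    return max(abs(a - b) for a, b in zip(x1, x2))

# Hypothetical points: coordinate differences are 3, 2, 0
x1 = [1.0, 2.0, 3.0]
x2 = [4.0, 0.0, 3.0]
print(chebyshev(x1, x2))  # -> 3.0
```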
4. Minkowski distance

- When $p=1$, it reduces to the Manhattan distance
- When $p=2$, it reduces to the Euclidean distance
- When $p \to \infty$, it reduces to the Chebyshev distance

$$
d = \sqrt[p]{\sum_{k=1}^n \mid x_{1k}-x_{2k} \mid ^p}
$$

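The Minkowski formula generalizes all three special cases above. A minimal sketch, with the function name and test vectors being my own illustrative choices:

```python
def minkowski(x1, x2, p):
    # d = (sum_k |x1k - x2k|^p)^(1/p)
    return sum(abs(a - b) ** p for a, b in zip(x1, x2)) ** (1 / p)

x1, x2 = [0.0, 0.0], [3.0, 4.0]
print(minkowski(x1, x2, 1))    # p=1, Manhattan distance -> 7.0
print(minkowski(x1, x2, 2))    # p=2, Euclidean distance -> 5.0
# As p grows, the value approaches the Chebyshev distance max(3, 4) = 4
print(minkowski(x1, x2, 100))  # already very close to 4.0
```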
5. Cosine distance

$$
\cos(\theta) = \frac{\sum_{k=1}^n x_{1k}x_{2k}}{\sqrt{\sum_{k=1}^n x_{1k}^2} \sqrt{\sum_{k=1}^n x_{2k}^2}}
$$
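The formula above actually yields the cosine *similarity* (the angle between the two vectors, ignoring magnitude); cosine distance is commonly taken as $1 - \cos(\theta)$. A minimal pure-Python sketch with hypothetical vectors:

```python
import math

def cosine_similarity(x1, x2):
    # cos(theta) = (x1 . x2) / (||x1|| * ||x2||)
    dot = sum(a * b for a, b in zip(x1, x2))
    norm1 = math.sqrt(sum(a * a for a in x1))
    norm2 = math.sqrt(sum(b * b for b in x2))
    return dot / (norm1 * norm2)

print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # orthogonal -> 0.0
print(cosine_similarity([1.0, 2.0], [2.0, 4.0]))  # parallel -> 1.0 (up to rounding)
```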
### Definition and Application Areas of Machine Learning