The Mahalanobis distance has more than one definition:

1) It can measure the distance between a single observation and a set of observations (a data set).

2) It can measure the dissimilarity between two random vectors drawn from the same distribution.

1) The Mahalanobis distance of an observation $\vec{x} = (x_1, x_2, x_3, \dots, x_N)^T$ from a set of observations with mean $\vec{\mu} = (\mu_1, \mu_2, \mu_3, \dots, \mu_N)^T$ and covariance matrix $S$ is defined as:

$$D_M(\vec{x}) = \sqrt{(\vec{x} - \vec{\mu})^T \, S^{-1} \, (\vec{x} - \vec{\mu})}.$$
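
As a quick illustration of this definition, here is a minimal NumPy sketch; the function name, the example data, and the assumption that the covariance matrix is invertible are mine, not part of the original text.

```python
import numpy as np

def mahalanobis_to_set(x, X):
    """Mahalanobis distance of point x from the set of observations X (rows = samples)."""
    mu = X.mean(axis=0)              # center of mass of the sample points
    S = np.cov(X, rowvar=False)      # sample covariance matrix
    S_inv = np.linalg.inv(S)         # assumes S is non-singular
    diff = x - mu
    return np.sqrt(diff @ S_inv @ diff)

# Made-up 2-D example
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) @ np.array([[2.0, 0.5], [0.0, 1.0]])
x = np.array([1.0, -0.5])
print(mahalanobis_to_set(x, X))
```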

Intuitive explanation

Consider the problem of estimating the probability that a test point in N-dimensional Euclidean space belongs to a set, where we are given sample points that definitely belong to that set. Our first step would be to find the average or center of mass of the sample points. Intuitively, the closer the point in question is to this center of mass, the more likely it is to belong to the set.

However, we also need to know if the set is spread out over a large range or a small range, so that we can decide whether a given distance from the center is noteworthy or not. The simplistic approach is to estimate the standard deviation of the distances of the sample points from the center of mass. If the distance between the test point and the center of mass is less than one standard deviation, then we might conclude that it is highly probable that the test point belongs to the set. The further away it is, the more likely that the test point should not be classified as belonging to the set.

This intuitive approach can be made quantitative by defining the normalized distance between the test point and the set to be $\frac{x - \mu}{\sigma}$. By plugging this into the normal distribution we can derive the probability of the test point belonging to the set.
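
The one-dimensional version of this idea fits in a couple of lines. The sketch below assumes a Gaussian model for the set and uses the standard normal tail probability as the "how unusual is this point" score; the sample values are made up for illustration.

```python
import numpy as np
from scipy.stats import norm

samples = np.array([9.8, 10.1, 10.4, 9.7, 10.0, 10.3])  # hypothetical 1-D observations
x = 11.2                                                 # test point

mu, sigma = samples.mean(), samples.std(ddof=1)
z = (x - mu) / sigma          # normalized distance (x - mu) / sigma
p_tail = norm.sf(abs(z))      # tail probability under a standard normal model
print(z, p_tail)
```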

The drawback of the above approach is that we assumed that the sample points are distributed about the center of mass in a spherical manner. Were the distribution to be decidedly non-spherical, for instance ellipsoidal, then we would expect the probability of the test point belonging to the set to depend not only on the distance from the center of mass, but also on the direction. In those directions where the ellipsoid has a short axis the test point must be closer, while in those where the axis is long the test point can be further away from the center.

Putting this on a mathematical basis, the ellipsoid that best represents the set's probability distribution can be estimated by building the covariance matrix of the samples. The Mahalanobis distance is the distance of the test point from the center of mass divided by the width of the ellipsoid in the direction of the test point.
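
One way to make the direction-dependence concrete is to whiten the data: after transforming coordinates by $S^{-1/2}$, the Mahalanobis distance becomes an ordinary Euclidean distance, so short-axis directions are stretched and long-axis directions are shrunk. The sketch below uses an eigendecomposition to build $S^{-1/2}$; the example data are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])  # elongated point cloud
mu = X.mean(axis=0)
S = np.cov(X, rowvar=False)

# Build S^{-1/2} from the eigendecomposition of the symmetric covariance matrix
vals, vecs = np.linalg.eigh(S)
S_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T

x = np.array([2.0, 1.0])
d_mahal = np.sqrt((x - mu) @ np.linalg.inv(S) @ (x - mu))
d_white = np.linalg.norm(S_inv_sqrt @ (x - mu))   # Euclidean distance after whitening
print(d_mahal, d_white)                           # the two values agree
```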

2) Mahalanobis distance can also be defined as a dissimilarity measure between two random vectors $\vec{x}$ and $\vec{y}$ of the same distribution with covariance matrix $S$:

$$d(\vec{x}, \vec{y}) = \sqrt{(\vec{x} - \vec{y})^T \, S^{-1} \, (\vec{x} - \vec{y})}.$$
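
For completeness, SciPy exposes this pairwise form directly. A minimal sketch, noting that scipy.spatial.distance.mahalanobis expects the inverse covariance matrix rather than $S$ itself; the sample data used to estimate $S$ are made up.

```python
import numpy as np
from scipy.spatial.distance import mahalanobis

X = np.random.default_rng(2).normal(size=(200, 3))    # samples used only to estimate S
S_inv = np.linalg.inv(np.cov(X, rowvar=False))

x = np.array([0.5, -1.0, 0.2])
y = np.array([1.5, 0.0, -0.3])
print(mahalanobis(x, y, S_inv))    # sqrt((x - y)^T S^{-1} (x - y))
```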

If the covariance matrix is the identity matrix, the Mahalanobis distance reduces to the Euclidean distance. If the covariance matrix is diagonal, then the resulting distance measure is called a standardized Euclidean distance:

$$d(\vec{x}, \vec{y}) = \sqrt{\sum_{i=1}^{N} \frac{(x_i - y_i)^2}{s_i^2}},$$

where $s_i$ is the standard deviation of the $x_i$ and $y_i$ over the sample set.
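
A short sketch of this diagonal special case; scipy.spatial.distance.seuclidean implements it and takes the vector of per-dimension variances $s_i^2$. The vectors and variances below are made up for illustration.

```python
import numpy as np
from scipy.spatial.distance import seuclidean

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 0.0, 4.0])
var = np.array([0.5, 2.0, 1.0])            # per-dimension variances s_i^2

d_manual = np.sqrt(np.sum((x - y) ** 2 / var))
d_scipy = seuclidean(x, y, var)            # standardized Euclidean distance
print(d_manual, d_scipy)                   # identical results
```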

References:

http://people.revoledu.com/kardi/tutorial/Similarity/MahalanobisDistance.html

https://en.wikipedia.org/wiki/Mahalanobis_distance

