In statistics and applications of statistics, normalization can have a range of meanings.[1] In the simplest cases, normalization of ratings means adjusting values measured on different scales to a notionally common scale, often prior to averaging. In more complicated cases, normalization may refer to more sophisticated adjustments where the intention is to bring the entire probability distributions of adjusted values into alignment. In the case of normalization of scores in educational assessment, there may be an intention to align distributions to a normal distribution. A different approach to normalization of probability distributions is quantile normalization, where the quantiles of the different measures are brought into alignment.

In another usage in statistics, normalization refers to the creation of shifted and scaled versions of statistics, where the intention is that these normalized values allow the comparison of corresponding normalized values for different datasets in a way that eliminates the effects of certain gross influences, as in an anomaly time series. Some types of normalization involve only a rescaling, to arrive at values relative to some size variable. In terms of levels of measurement, such ratios only make sense for ratio measurements (where ratios of measurements are meaningful), not interval measurements (where only distances are meaningful, but not ratios).
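The anomaly-series idea can be illustrated concretely: subtract each calendar month's long-term mean from a monthly series, so that what remains is comparable across years with the seasonal cycle removed. A minimal sketch with synthetic data (all values hypothetical):

```python
import numpy as np

# Hypothetical monthly temperature series: 3 years x 12 months,
# with a seasonal cycle plus noise.
rng = np.random.default_rng(0)
months = np.tile(np.arange(12), 3)
temps = 15 + 10 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 1, months.size)

# Anomaly series: subtract each month's long-term mean, removing the
# seasonal cycle so that different years can be compared directly.
monthly_means = np.array([temps[months == m].mean() for m in range(12)])
anomalies = temps - monthly_means[months]

# By construction, each month's anomalies now average to zero.
print(anomalies[:12])
```

This is the same shift-and-scale idea as above with the scale left at 1; a full standardization would also divide by each month's standard deviation.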

In theoretical statistics, parametric normalization can often lead to pivotal quantities – functions whose sampling distribution does not depend on the parameters – and to ancillary statistics – pivotal quantities that can be computed from observations, without knowing parameters.
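The pivotal property can be checked by simulation: for normal samples, the normalized quantity t = (x̄ − μ)/(s/√n) has the same sampling distribution (Student's t with n − 1 degrees of freedom) whatever the values of μ and σ. A sketch, with arbitrary illustrative parameter settings:

```python
import numpy as np

rng = np.random.default_rng(42)

def t_statistics(mu, sigma, n=10, reps=50_000):
    """Simulate the pivotal quantity t = (xbar - mu) / (s / sqrt(n))."""
    draws = rng.normal(mu, sigma, size=(reps, n))
    means = draws.mean(axis=1)
    sds = draws.std(axis=1, ddof=1)  # sample standard deviation
    return (means - mu) / (sds / np.sqrt(n))

# Two very different parameter settings...
t_a = t_statistics(mu=0.0, sigma=1.0)
t_b = t_statistics(mu=100.0, sigma=50.0)

# ...yet the sampling distributions of t agree: both are Student's t
# with 9 degrees of freedom, independent of mu and sigma.
print(np.median(t_a), np.median(t_b))
```

Note that t here is pivotal but not ancillary: computing it requires knowing μ, which is exactly the distinction the paragraph above draws.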

Examples

There are various normalizations in statistics – nondimensional ratios of errors, residuals, means and standard deviations, which are hence scale invariant – some of which may be summarized as follows. Note that in terms of levels of measurement, these ratios only make sense for ratio measurements (where ratios of measurements are meaningful), not interval measurements (where only distances are meaningful, but not ratios).

Name | Formula | Use
Standard score | (X − μ)/σ | Normalizing errors when population parameters are known. Works well for populations that are normally distributed.
Student's t-statistic | (X̄ − μ)/(s/√n) | Normalizing residuals when population parameters are unknown (estimated).
Studentized residual | (X_i − μ̂_i)/σ̂_i | Normalizing residuals when parameters are estimated, particularly across different data points in regression analysis.
Standardized moment | μ_k/σ^k | Normalizing moments, using the standard deviation σ as a measure of scale.
Coefficient of variation | σ/μ | Normalizing dispersion, using the mean μ as a measure of scale, particularly for positive distributions such as the exponential and Poisson distributions.
Feature scaling | X′ = (X − X_min)/(X_max − X_min) | Bringing all values into the range [0, 1]. This can be generalized to restrict the range to arbitrary points a and b using X′ = a + (X − X_min)(b − a)/(X_max − X_min).
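A few of the ratios in the table can be sketched directly; the sample values below are arbitrary illustrations:

```python
import numpy as np

x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
mu, sigma = x.mean(), x.std()  # treating x as the whole population

# Standard score: errors normalized with known population parameters.
z = (x - mu) / sigma

# Coefficient of variation: dispersion relative to the mean.
cv = sigma / mu

# Feature scaling onto [a, b]; a=0, b=1 recovers min-max scaling.
def rescale(x, a=0.0, b=1.0):
    return a + (x - x.min()) * (b - a) / (x.max() - x.min())

scaled = rescale(x)
print(z.mean(), z.std())           # 0.0 1.0 by construction
print(scaled.min(), scaled.max())  # 0.0 1.0
```

Standardizing makes the scores dimensionless, which is why they can be compared across datasets measured in different units.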

Note that some other ratios, such as the variance-to-mean ratio σ²/μ, are also used for normalization but are not nondimensional: the units do not cancel, so the ratio has units and is not scale invariant.

Other types

Other non-dimensional normalizations that can be used with no assumptions on the distribution include:

  • Assignment of percentiles. This is common on standardized tests. See also quantile normalization.
  • Normalization by adding and/or multiplying by constants so values fall between 0 and 1. This is used for probability density functions, with applications in fields such as physical chemistry in assigning probabilities to |ψ|².
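Both bullet points can be illustrated in a few lines: assigning a percentile rank to each test score, and multiplying by a constant so that values form a probability distribution (the numbers are hypothetical):

```python
import numpy as np

# Percentile rank: the fraction of scores at or below each score.
# (One of several common percentile conventions.)
scores = np.array([55.0, 78.0, 62.0, 90.0, 78.0])
pct = np.array([(scores <= s).mean() for s in scores]) * 100

# Normalizing constant: divide unnormalized, non-negative values
# (e.g. |psi|^2 on a grid) by their sum so they total one.
weights = np.array([1.0, 3.0, 4.0, 2.0])
probs = weights / weights.sum()

print(pct)    # percentile rank of each score
print(probs)  # a valid discrete probability distribution
```

For a continuous density, the constant would instead be one over the integral of the unnormalized function, obtained analytically or by numerical quadrature.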

References

  1. Dodge, Y. (2003) The Oxford Dictionary of Statistical Terms, OUP. ISBN 0-19-920613-9 (entry for normalization of scores)
