Information Entropy


Time Limit: 2 Seconds      Memory Limit: 65536 KB      Special Judge


Information Theory is one of the most popular courses at Marjar University. In this course, there is an important chapter about information entropy.

Entropy is the average amount of information contained in each message received. Here, a message stands for an event, sample, or character drawn from a distribution or data stream.
Entropy thus characterizes our uncertainty about a source of information, which is in turn characterized by the probability distribution of the samples drawn from it. The idea is that the less likely an event is, the more information it provides when it occurs.

Generally, "entropy" stands for "disorder" or uncertainty. The entropy we talk about here was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".
We also call it Shannon entropy or information entropy to distinguish it from other uses of the term, which appears in various areas of physics in different forms.

Named after Boltzmann's H-theorem, Shannon defined the entropy H (Greek capital letter eta) of a discrete random variable X with possible values {x_1, x_2, ..., x_n} and probability mass function P(X) as:

H(X) = E[−ln(P(X))]

Here E is the expected value operator. When taken from a finite sample, the entropy can explicitly be written as

H(X) = −∑_{i=1}^{n} P(x_i) log_b(P(x_i))

where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10. The units of entropy are the bit for b = 2, the nat for b = e, and the dit (or digit) for b = 10, respectively.
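
For instance, with b = 2 and the distribution from the first sample case below (P = 0.25, 0.25, 0.5):

H(X) = −(0.25 log_2 0.25 + 0.25 log_2 0.25 + 0.5 log_2 0.5) = 0.5 + 0.5 + 0.5 = 1.5 bits

which matches the first line of the sample output.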

In the case of P(x_i) = 0 for some i, the value of the corresponding summand 0 · log_b(0) is taken to be the well-known limit:

0 · log_b(0) = lim_{p→0⁺} p log_b(p)
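
This limit equals 0, as L'Hôpital's rule shows:

lim_{p→0⁺} p log_b(p) = lim_{p→0⁺} log_b(p) / (1/p) = lim_{p→0⁺} (1/(p ln b)) / (−1/p²) = lim_{p→0⁺} (−p / ln b) = 0

so zero-probability values contribute nothing to the sum.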

Your task is to calculate the entropy of a finite sample with N values.

Input

There are multiple test cases. The first line of input contains an integer T indicating the number of test cases. For each test case:

The first line contains an integer N (1 <= N <= 100) and a string S. The string S is one of "bit", "nat" or "dit", indicating the unit of entropy.

In the next line, there are N non-negative integers P_1, P_2, ..., P_N. P_i means the probability of the i-th value in percentage, and the sum of all P_i will be 100.

Output

For each test case, output the entropy in the corresponding unit.

Any solution with a relative or absolute error of at most 10⁻⁸ will be accepted.

Sample Input

3
3 bit
25 25 50
7 nat
1 2 4 8 16 32 37
10 dit
10 10 10 10 10 10 10 10 10 10

Sample Output

1.500000000000
1.480810832465
1.000000000000
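
As a quick check of the third case: all ten values are equiprobable with P(x_i) = 0.1, so H(X) = −10 · 0.1 · log_10(0.1) = 1 dit, matching the last line of the sample output.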

Author: ZHOU, Yuchen

Source: The 2014 ACM-ICPC Asia Mudanjiang Regional Contest
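
The accepted solution below follows the formula directly: it reads each distribution, converts the percentages to probabilities, and sums −p · log_b(p), skipping zero probabilities in line with the limit convention above.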

#include <cstdio>
#include <cmath>

double p[200];

// log of x in the base selected by kind: 1 = natural log (nat),
// 2 = base 2 (bit), 3 = base 10 (dit).
double logBase(int kind, double x)
{
    if (kind == 1) return log(x);
    else if (kind == 2) return log2(x);
    else return log10(x);
}

int main()
{
    int T;
    scanf("%d", &T);
    while (T--)
    {
        int n;
        char op[20];
        scanf("%d%s", &n, op);

        // The unit strings "bit", "nat" and "dit" start with distinct
        // letters, so the first character is enough to pick the base.
        int kind = 1;
        if (op[0] == 'b') kind = 2;
        else if (op[0] == 'd') kind = 3;

        double ans = 0.0;
        for (int i = 0; i < n; i++)
        {
            scanf("%lf", p + i);
            if (p[i] == 0) continue;  // 0 * log(0) contributes 0 by the limit above
            p[i] /= 100.0;            // percentage -> probability
            ans += -p[i] * logBase(kind, p[i]);
        }
        printf("%.10f\n", ans);
    }
    return 0;
}
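
Note that the program prints 10 decimal places while the sample output shows 12; since the special judge accepts any answer within a relative or absolute error of 10⁻⁸, this precision is sufficient.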

