ZOJ 3827 Information Entropy (easy problem)
Information Entropy
Time Limit: 2 Seconds Memory Limit: 65536 KB Special Judge
Information Theory is one of the most popular courses in Marjar University. In this course, there is an important chapter about information entropy.
Entropy is the average amount of information contained in each message received. Here, a message stands for an event, or a sample or a character drawn from a distribution or a data stream.
Entropy thus characterizes our uncertainty about our source of information. The source is also characterized by the probability distribution of the samples drawn from it. The idea here is that the less likely an event is, the more information it provides when
it occurs.
Generally, "entropy" stands for "disorder" or uncertainty. The entropy we talk about here was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".
We also call it Shannon entropy or information entropy to distinguish it from other uses of the term, which appears in various parts of physics in different forms.
Named after Boltzmann's H-theorem, Shannon defined the entropy H (Greek capital letter eta) of a discrete random variable X with possible values {x1, x2, ..., xn} and probability mass function P(X) as:

H(X) = E[-log_b(P(X))]
Here E is the expected value operator. When taken from a finite sample, the entropy can explicitly be written as

H(X) = -∑_{i=1}^{n} P(xi) log_b(P(xi))
where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10. The unit of entropy is bit for b = 2, nat for b = e, and dit (or digit) for b = 10, respectively.
In the case of P(xi) = 0 for some i, the value of the corresponding summand 0 · log_b(0) is taken to be 0, consistent with the well-known limit:

lim_{p -> 0+} p · log_b(p) = 0
Your task is to calculate the entropy of a finite sample with N values.
Input
There are multiple test cases. The first line of input contains an integer T indicating the number of test cases. For each test case:
The first line contains an integer N (1 <= N <= 100) and a string S. The string S is one of "bit", "nat" or "dit", indicating the unit of entropy.
In the next line, there are N non-negative integers P1, P2, ..., PN. Pi means the probability of the i-th value in percentage, and the sum of all Pi will be 100.
Output
For each test case, output the entropy in the corresponding unit.
Any solution with a relative or absolute error of at most 10^-8 will be accepted.
Sample Input
3
3 bit
25 25 50
7 nat
1 2 4 8 16 32 37
10 dit
10 10 10 10 10 10 10 10 10 10
Sample Output
1.500000000000
1.480810832465
1.000000000000
Author: ZHOU, Yuchen
Source: The 2014 ACM-ICPC Asia Mudanjiang Regional Contest
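The task is then a direct evaluation of the finite-sample formula above. As a quick sanity check (my own arithmetic, not part of the statement), the first sample in bits works out by hand:

H = -(0.25 log2(0.25) + 0.25 log2(0.25) + 0.5 log2(0.5)) = 0.5 + 0.5 + 0.5 = 1.5

which matches the expected output 1.500000000000.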
#include <iostream>
#include <cstdio>
#include <cstring>
#include <algorithm>
#include <cmath>
using namespace std;

double p[200];

// Logarithm in the base matching the requested unit:
// kind 1 -> nat (ln), kind 2 -> bit (log2), kind 3 -> dit (log10).
double xxx(int kind, double x)
{
    if (kind == 1) return log(x);
    else if (kind == 2) return log2(x);
    else return log10(x);
}

int main()
{
    int T_T;
    scanf("%d", &T_T);
    while (T_T--)
    {
        int n;
        char op[20];
        scanf("%d%s", &n, op);
        double ans = 0.0;
        int kind = 1;                     // default: nat
        if (op[0] == 'b') kind = 2;       // bit
        else if (op[0] == 'd') kind = 3;  // dit
        for (int i = 0; i < n; i++)
        {
            scanf("%lf", p + i);
            if (p[i] == 0) continue;  // 0 * log(0) contributes 0 by the limit
            p[i] /= 100.;             // percentage -> probability
            ans += -1 * p[i] * xxx(kind, p[i]);
        }
        printf("%.10lf\n", ans);
    }
    return 0;
}
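For reference, here is a minimal alternative sketch (my own variant, not the judge's reference solution) that avoids branching on the unit inside the loop: accumulate the sum once in nats and convert at the end with the change-of-base identity log_b(x) = ln(x) / ln(b).

#include <cstdio>
#include <cmath>

// Sketch of the change-of-base alternative: compute the entropy with the
// natural logarithm only, then divide by ln(b) for the requested unit.
int main()
{
    int T;
    scanf("%d", &T);
    while (T--)
    {
        int n;
        char unit[8];
        scanf("%d%s", &n, unit);
        // ln(b) for the requested unit: bit -> ln 2, dit -> ln 10, nat -> 1
        double lnb = (unit[0] == 'b') ? log(2.0)
                   : (unit[0] == 'd') ? log(10.0)
                   : 1.0;
        double ans = 0.0;
        for (int i = 0; i < n; i++)
        {
            double p;
            scanf("%lf", &p);
            if (p > 0)              // 0 * log(0) contributes 0 by the limit
            {
                p /= 100.0;         // percentage -> probability
                ans -= p * log(p);  // accumulate in nats
            }
        }
        printf("%.12f\n", ans / lnb);  // convert nats to the requested unit
    }
    return 0;
}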