ZOJ 3827 Information Entropy (easy problem)
Information Entropy
Time Limit: 2 Seconds
Memory Limit: 65536 KB Special Judge
Information Theory is one of the most popular courses in Marjar University. In this course, there is an important chapter about information entropy.
Entropy is the average amount of information contained in each message received. Here, a message stands for an event, or a sample or a character drawn from a distribution or a data stream. Entropy thus characterizes our uncertainty about our source of information.
The source is also characterized by the probability distribution of the samples drawn from it. The idea here is that the less likely an event is, the more information it provides when it occurs.
Generally, "entropy" stands for "disorder" or uncertainty. The entropy we talk about here was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication". We also call it Shannon entropy or information entropy to distinguish it from other uses of the term, which appears in various parts of physics in different forms.
Named after Boltzmann's H-theorem, Shannon defined the entropy H (Greek capital letter eta) of a discrete random variable X with possible values {x1, x2, ..., xn} and probability mass function P(X) as:

$$H(X) = \mathrm{E}\left[-\log_b P(X)\right]$$
Here E is the expected value operator. When taken from a finite sample, the entropy can explicitly be written as

$$H(X) = -\sum_{i=1}^{n} P(x_i) \log_b P(x_i)$$
where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10. The unit of entropy is bit for b = 2, nat for b = e, and dit (or digit) for b = 10, respectively.
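As a quick check of the finite-sample formula, a source with probabilities 1/4, 1/4, 1/2 and b = 2 gives:

$$H(X) = -\tfrac{1}{4}\log_2\tfrac{1}{4} - \tfrac{1}{4}\log_2\tfrac{1}{4} - \tfrac{1}{2}\log_2\tfrac{1}{2} = \tfrac{1}{2} + \tfrac{1}{2} + \tfrac{1}{2} = 1.5 \text{ bits,}$$

which matches the first sample case of this problem.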
In the case of P(xi) = 0 for some i, the value of the corresponding summand 0 logb(0) is taken to be the well-known limit:

$$\lim_{p \to 0^{+}} p \log_b p = 0$$
Your task is to calculate the entropy of a finite sample with N values.
Input
There are multiple test cases. The first line of input contains an integer
T indicating the number of test cases. For each test case:
The first line contains an integer N (1 <= N <= 100) and a string S. The string S is one of "bit", "nat" or "dit", indicating the unit of entropy.
The next line contains N non-negative integers P1, P2, ..., PN, where Pi is the probability of the i-th value in percentage; the Pi sum to 100.
Output
For each test case, output the entropy in the corresponding unit.
Any solution with a relative or absolute error of at most 10^-8 will be accepted.
Sample Input
3
3 bit
25 25 50
7 nat
1 2 4 8 16 32 37
10 dit
10 10 10 10 10 10 10 10 10 10
Sample Output
1.500000000000
1.480810832465
1.000000000000
Problem summary: you are given N numbers and a string str. Normalize the numbers into probabilities p[i]. If str is "bit", compute sigma(-p[i] * log2(p[i])) for 1 <= i <= N; if str is "nat", compute sigma(-p[i] * ln(p[i])); if str is "dit", compute sigma(-p[i] * log10(p[i])).
Accepted code:
#include <cstdio>
#include <cstring>
#include <cmath>
using namespace std;

int main()
{
    int t;
    int N;
    char str[10];
    double a[110];
    scanf("%d", &t);
    while (t--)
    {
        scanf("%d%s", &N, str);
        double sum = 0;
        for (int i = 0; i < N; i++)
        {
            scanf("%lf", &a[i]);
            sum += a[i];               // percentages sum to 100
        }
        double ans = 0;
        if (strcmp(str, "bit") == 0)   // base 2
        {
            for (int i = 0; i < N; i++)
            {
                if (a[i] == 0) continue;                   // 0 * log(0) = 0
                ans += -log2(a[i] / sum) * (a[i] / sum);   // a[i]/sum is the probability
            }
        }
        else if (strcmp(str, "nat") == 0)  // base e
        {
            for (int i = 0; i < N; i++)
            {
                if (a[i] == 0) continue;
                ans += -log(a[i] / sum) * (a[i] / sum);
            }
        }
        else                               // "dit", base 10
        {
            for (int i = 0; i < N; i++)
            {
                if (a[i] == 0) continue;
                ans += -log10(a[i] / sum) * (a[i] / sum);
            }
        }
        printf("%.12f\n", ans);
    }
    return 0;
}