2014 Mudanjiang Regional Onsite Contest, Problem I (ZOJ 3827 Information Entropy)
Time Limit: 2000 ms    Memory Limit: 65536 KB    64bit IO Format: %lld & %llu
Description
Information Theory is one of the most popular courses in Marjar University. In this course, there is an important chapter about information entropy.
Entropy is the average amount of information contained in each message received. Here, a message stands for an event, or a sample or a character drawn from a distribution or a data stream. Entropy thus characterizes our uncertainty about our source of information. The source is also characterized by the probability distribution of the samples drawn from it. The idea here is that the less likely an event is, the more information it provides when it occurs.
Generally, "entropy" stands for "disorder" or uncertainty. The entropy we talk about here was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication". We also call it Shannon entropy or information entropy to distinguish from other occurrences of the term, which appears in various parts of physics in different forms.
Named after Boltzmann's H-theorem, Shannon defined the entropy H (Greek letter Η, η) of a discrete random variable X with possible values {x1, x2, ..., xn} and probability mass function P(X) as:
H(X) = E[-logb(P(X))]
Here E is the expected value operator. When taken from a finite sample, the entropy can explicitly be written as
H(X) = -Σ(i=1..n) P(xi) · logb(P(xi))
Where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10. The unit of entropy is bit for b = 2, nat for b = e, and dit (or digit) for b = 10 respectively.
In the case of P(xi) = 0 for some i, the value of the corresponding summand 0 · logb(0) is taken to be a well-known limit:
lim(p→0+) p · logb(p) = 0
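For example, a fair coin with P = {0.5, 0.5} has entropy H = -(0.5 · log2(0.5) + 0.5 · log2(0.5)) = 1 bit.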
Your task is to calculate the entropy of a finite sample with N values.
Input
There are multiple test cases. The first line of input contains an integer T indicating the number of test cases. For each test case:
The first line contains an integer N (1 <= N <= 100) and a string S. The string S is one of "bit", "nat" or "dit", indicating the unit of entropy.
In the next line, there are N non-negative integers P1, P2, ..., PN. Pi means the probability of the i-th value in percentage, and the sum of Pi will be 100.
Output
For each test case, output the entropy in the corresponding unit.
Any solution with a relative or absolute error of at most 10^-8 will be accepted.
Sample Input
3
3 bit
25 25 50
7 nat
1 2 4 8 16 32 37
10 dit
10 10 10 10 10 10 10 10 10 10
Sample Output
1.500000000000
1.480810832465
1.000000000000

Apply the second formula given in the statement: when the unit string is "bit" the logarithm base is 2, when it is "nat" the base is e, and when it is "dit" the base is 10.
Note that when a 0 appears among the given probabilities it must be skipped, since its term is taken to be 0.
Note: log2(x), log10(x), and ln(x) can be computed directly with log2(), log10(), and log() from the <cmath> header.
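As a quick check with the first sample (unit "bit", percentages 25 25 50):
H = -(0.25 · log2(0.25) + 0.25 · log2(0.25) + 0.5 · log2(0.5)) = 0.25·2 + 0.25·2 + 0.5·1 = 1.5
which matches the expected output 1.500000000000.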
#include <iostream>
#include <cstdio>
#include <cmath>
#include <cstring>
#define maxn 110
using namespace std;

int main()
{
    int t, n;
    double a[maxn];         // probabilities given in percent
    double sum;             // accumulated entropy
    char s[4];              // unit string: "bit", "nat" or "dit"
    scanf("%d", &t);
    while (t--)
    {
        sum = 0;
        scanf("%d %s", &n, s);
        for (int i = 1; i <= n; ++i)
            scanf("%lf", &a[i]);
        if (strcmp(s, "bit") == 0)
        {
            // base 2: log2(x) computed as log10(x) / log10(2)
            for (int i = 1; i <= n; ++i)
            {
                if (a[i] != 0)  // a zero probability contributes nothing
                    sum -= a[i] * 0.01 * (log10(a[i] * 0.01) / log10(2.0));
            }
        }
        else if (strcmp(s, "nat") == 0)
        {
            // base e: natural logarithm
            for (int i = 1; i <= n; ++i)
            {
                if (a[i] != 0)
                    sum -= a[i] * 0.01 * log(a[i] * 0.01);
            }
        }
        else if (strcmp(s, "dit") == 0)
        {
            // base 10
            for (int i = 1; i <= n; ++i)
            {
                if (a[i] != 0)
                    sum -= a[i] * 0.01 * log10(a[i] * 0.01);
            }
        }
        printf("%.12lf\n", sum);
    }
    return 0;
}
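As the note above mentions, log2() from <cmath> can be used directly instead of the log10 ratio. A more compact single-loop variant (a sketch along the same idea, not the original author's code) could look like this:

#include <cstdio>
#include <cmath>
#include <cstring>

int main()
{
    int t;
    scanf("%d", &t);
    while (t--)
    {
        int n;
        char unit[8];
        scanf("%d %s", &n, unit);
        double sum = 0.0;
        for (int i = 0; i < n; ++i)
        {
            double p;
            scanf("%lf", &p);
            p *= 0.01;                        // percentage -> probability
            if (p <= 0.0)
                continue;                     // 0 * log(0) is taken as 0
            if (strcmp(unit, "bit") == 0)
                sum -= p * log2(p);           // base 2
            else if (strcmp(unit, "nat") == 0)
                sum -= p * log(p);            // base e
            else
                sum -= p * log10(p);          // base 10
        }
        printf("%.12f\n", sum);
    }
    return 0;
}

Branching on the unit inside the loop avoids writing the same loop three times; with N <= 100 and probabilities that are multiples of 0.01, double precision keeps the error well within the 10^-8 tolerance.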