http://open.sina.com.cn/course/id_1047/

What scientists do is not just collect data and facts and stick them in big books; more importantly, their main job is to talk to each other about what they don't know. They really do care about a kind of ignorance that James Clerk Maxwell, perhaps the greatest physicist between Newton and Einstein, called "thoroughly conscious ignorance".
It is true that to be a scientist we have to know a lot of stuff, but knowing a lot of stuff doesn't make us a scientist. "You need to know a lot of stuff to be a lawyer or an accountant or an electrician or a carpenter. But in science, knowing a lot of stuff is not the point. Knowing a lot of stuff is there to help you get to more ignorance. So knowledge is a big subject, but I would say ignorance is a bigger one."

To illustrate the scientific process, we tend to use models of science such as the puzzle model, the peels of an onion, the iceberg idea, and the magic well (or the ripples on a pond). Of these models, Stuart Firestein believes the last one best reflects the nature of scientific work: "I think what really happens in science is a model more like the magic well, where no matter how many buckets you take out, there's always another bucket of water to be had, or my particular favorite one, with the effect and everything, the ripples on a pond." He also says in his talk, "So if you think of knowledge being this ever-expanding ripple on a pond, the important thing to realize is that our ignorance, the circumference of this knowledge, also grows with knowledge. So the knowledge generates ignorance." (I recommend going to 7:35 of the talk and listening to the whole wonderful passage.) George Bernard Shaw claimed, in a toast at a dinner celebrating Einstein's work, that science just creates more questions than it answers: "Science is always wrong. It never solves a problem without creating 10 more." The same idea comes from the philosopher Immanuel Kant as "question propagation", the idea that every answer begets more questions.

He also talked about education.

Here is the Transcript.

There is an ancient proverb that says it's very difficult to find a black cat in a dark room, especially when there is no cat. I find this a particularly apt description of science and how science works -- bumbling around in a dark room, bumping into things, trying to figure out what shape this might be, what that might be, there are reports of a cat somewhere around, they may not be reliable, they may be, and so forth and so on.

0:40 Now I know this is different than the way most people think about science. Science, we generally are told, is a very well-ordered mechanism for understanding the world, for gaining facts, for gaining data, that it's rule-based, that scientists use this thing called the scientific method and we've been doing this for 14 generations or so now, and the scientific method is a set of rules for getting hard, cold facts out of the data.

1:06 I'd like to tell you that's not the case. So there's the scientific method, but what's really going on is this. (Laughter)

1:12 [The Scientific Method vs. Farting Around]

1:13 And it's going on kind of like that.

1:16 [... in the dark] (Laughter)

1:17 So what is the difference, then, between the way I believe science is pursued and the way it seems to be perceived? So this difference first came to me in some ways in my dual role at Columbia University, where I'm both a professor and run a laboratory in neuroscience where we try to figure out how the brain works. We do this by studying the sense of smell, the sense of olfaction, and in the laboratory, it's a great pleasure and fascinating work and exciting to work with graduate students and post-docs and think up cool experiments to understand how this sense of smell works and how the brain might be working, and, well, frankly, it's kind of exhilarating.

1:58 But at the same time, it's my responsibility to teach a large course to undergraduates on the brain, and that's a big subject, and it takes quite a while to organize that, and it's quite challenging and it's quite interesting, but I have to say, it's not so exhilarating. So what was the difference? Well, the course I was and am teaching is called Cellular and Molecular Neuroscience - I. (Laughs) It's 25 lectures full of all sorts of facts, it uses this giant book called "Principles of Neural Science" by three famous neuroscientists. This book comes in at 1,414 pages, it weighs a hefty seven and a half pounds. Just to put that in some perspective, that's the weight of two normal human brains.

2:46 (Laughter)

2:50 So I began to realize, by the end of this course, that the students maybe were getting the idea that we must know everything there is to know about the brain. That's clearly not true. And they must also have this idea, I suppose, that what scientists do is collect data and collect facts and stick them in these big books. And that's not really the case either. When I go to a meeting, after the meeting day is over and we collect in the bar over a couple of beers with my colleagues, we never talk about what we know. We talk about what we don't know. We talk about what still has to get done, what's so critical to get done in the lab. Indeed, this was, I think, best said by Marie Curie who said that one never notices what has been done but only what remains to be done. This was in a letter to her brother after obtaining her second graduate degree, I should say.

3:38  I have to point out this has always been one of my favorite pictures of Marie Curie, because I am convinced that that glow behind her is not a photographic effect. (Laughter) That's the real thing. It is true that her papers are, to this day, stored in a basement room in the Bibliothèque Française in a concrete room that's lead-lined, and if you're a scholar and you want access to these notebooks, you have to put on a full radiation hazmat suit, so it's pretty scary business.

4:05 Nonetheless, this is what I think we were leaving out of our courses and leaving out of the interaction that we have with the public as scientists, the what-remains-to-be-done. This is the stuff that's exhilarating and interesting. It is, if you will, the ignorance. That's what was missing.

4:21  So I thought, well, maybe I should teach a course on ignorance, something I can finally excel at, perhaps, for example. So I did start teaching this course on ignorance, and it's been quite interesting and I'd like to tell you to go to the website. You can find all sorts of information there. It's wide open. And it's been really quite an interesting time for me to meet up with other scientists who come in and talk about what it is they don't know.

4:45 Now I use this word "ignorance," of course, to be at least in part intentionally provocative, because ignorance has a lot of bad connotations and I clearly don't mean any of those. So I don't mean stupidity, I don't mean a callow indifference to fact or reason or data. The ignorant are clearly unenlightened, unaware, uninformed, and present company today excepted, often occupy elected offices, it seems to me. That's another story, perhaps.

5:12 I mean a different kind of ignorance. I mean a kind of ignorance that's less pejorative, a kind of ignorance that comes from a communal gap in our knowledge, something that's just not there to be known or isn't known well enough yet or we can't make predictions from, the kind of ignorance that's maybe best summed up in a statement by James Clerk Maxwell, perhaps the greatest physicist between Newton and Einstein, who said, "Thoroughly conscious ignorance is the prelude to every real advance in science." I think it's a wonderful idea: thoroughly conscious ignorance.

5:41 So that's the kind of ignorance that I want to talk about today, but of course the first thing we have to clear up is what are we going to do with all those facts? So it is true that science piles up at an alarming rate. We all have this sense that science is this mountain of facts, this accumulation model of science, as many have called it, and it seems impregnable, it seems impossible. How can you ever know all of this? And indeed, the scientific literature grows at an alarming rate. In 2006, there were 1.3 million papers published. There's about a two-and-a-half-percent yearly growth rate, and so last year we saw over one and a half million papers being published. Divide that by the number of minutes in a year, and you wind up with three new papers per minute. So I've been up here a little over 10 minutes, I've already lost three papers. I have to get out of here actually. I have to go read.
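
A quick check of the speaker's arithmetic, assuming the figures he cites (1.3 million papers in 2006 and about 2.5% yearly growth, compounded to 2013, the assumed year of the talk), does reproduce the "three new papers per minute" estimate. A minimal Python sketch:

    # Rough sanity check of the papers-per-minute figure from the talk.
    # Assumptions: 1.3 million papers in 2006, ~2.5% annual growth,
    # compounded to 2013 (the assumed year of the talk).
    papers_2006 = 1_300_000
    growth_rate = 0.025
    years = 2013 - 2006

    papers_now = papers_2006 * (1 + growth_rate) ** years
    minutes_per_year = 365 * 24 * 60

    print(f"Estimated papers per year: {papers_now:,.0f}")              # ~1.55 million
    print(f"New papers per minute: {papers_now / minutes_per_year:.1f}")  # ~2.9

This lands at roughly 1.5 million papers a year and about three per minute, consistent with the numbers in the talk.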

6:27 So what do we do about this? Well, the fact is that what scientists do about it is a kind of a controlled neglect, if you will. We just don't worry about it, in a way. The facts are important. You have to know a lot of stuff to be a scientist. That's true. But knowing a lot of stuff doesn't make you a scientist. You need to know a lot of stuff to be a lawyer or an accountant or an electrician or a carpenter. But in science, knowing a lot of stuff is not the point. Knowing a lot of stuff is there to help you get to more ignorance. So knowledge is a big subject, but I would say ignorance is a bigger one.

7:05 So this leads us to maybe think about, a little bit about, some of the models of science that we tend to use, and I'd like to disabuse you of some of them. So one of them, a popular one, is that scientists are patiently putting the pieces of a puzzle together to reveal some grand scheme or another. This is clearly not true. For one, with puzzles, the manufacturer has guaranteed that there's a solution. We don't have any such guarantee. Indeed, there are many of us who aren't so sure about the manufacturer.

7:30 (Laughter)

7:33 So I think the puzzle model doesn't work.

7:35 Another popular model is that science is busy unraveling things the way you unravel the peels of an onion. So peel by peel, you take away the layers of the onion to get at some fundamental kernel of truth. I don't think that's the way it works either. Another one, a kind of popular one, is the iceberg idea, that we only see the tip of the iceberg but underneath is where most of the iceberg is hidden. But all of these models are based on the idea of a large body of facts that we can somehow or another get completed. We can chip away at this iceberg and figure out what it is, or we could just wait for it to melt, I suppose, these days, but one way or another we could get to the whole iceberg. Right? Or make it manageable. But I don't think that's the case. I think what really happens in science is a model more like the magic well, where no matter how many buckets you take out, there's always another bucket of water to be had, or my particular favorite one, with the effect and everything, the ripples on a pond. So if you think of knowledge being this ever-expanding ripple on a pond, the important thing to realize is that our ignorance, the circumference of this knowledge, also grows with knowledge. So the knowledge generates ignorance. This is really well said, I thought, by George Bernard Shaw. This is actually part of a toast that he delivered to celebrate Einstein at a dinner celebrating Einstein's work, in which he claims that science just creates more questions than it answers. ["Science is always wrong. It never solves a problem without creating 10 more."]

8:52 I find that kind of glorious, and I think he's precisely right, plus it's a kind of job security. As it turns out, he kind of cribbed that from the philosopher Immanuel Kant who a hundred years earlier had come up with this idea of question propagation, that every answer begets more questions. I love that term, "question propagation," this idea of questions propagating out there.

9:15 So I'd say the model we want to take is not that we start out kind of ignorant and we get some facts together and then we gain knowledge. It's rather kind of the other way around, really. What do we use this knowledge for? What are we using this collection of facts for? We're using it to make better ignorance, to come up with, if you will, higher-quality ignorance. Because, you know, there's low-quality ignorance and there's high-quality ignorance. It's not all the same. Scientists argue about this all the time. Sometimes we call them bull sessions. Sometimes we call them grant proposals. But nonetheless, it's what the argument is about. It's the ignorance. It's the what we don't know. It's what makes a good question.

9:53 So how do we think about these questions? I'm going to show you a graph that shows up quite a bit on happy hour posters in various science departments. This graph asks the relationship between what you know and how much you know about it. So what you know, you can know anywhere from nothing to everything, of course, and how much you know about it can be anywhere from a little to a lot. So let's put a point on the graph. There's an undergraduate. Doesn't know much but they have a lot of interest. They're interested in almost everything. Now you look at a master's student, a little further along in their education, and you see they know a bit more, but it's been narrowed somewhat. And finally you get your Ph.D., where it turns out you know a tremendous amount about almost nothing. (Laughter) What's really disturbing is the trend line that goes through that because, of course, when it dips below the zero axis, there, it gets into a negative area. That's where you find people like me, I'm afraid.

10:50 So the important thing here is that this can all be changed. This whole view can be changed by just changing the label on the x-axis. So instead of how much you know about it, we could say, "What can you ask about it?" So yes, you do need to know a lot of stuff as a scientist, but the purpose of knowing a lot of stuff is not just to know a lot of stuff. That just makes you a geek, right? Knowing a lot of stuff, the purpose is to be able to ask lots of questions, to be able to frame thoughtful, interesting questions, because that's where the real work is.

11:21 Let me give you a quick idea of a couple of these sorts of questions. I'm a neuroscientist, so how would we come up with a question in neuroscience? Because it's not always quite so straightforward. So, for example, we could say, well what is it that the brain does? Well, one thing the brain does, it moves us around. We walk around on two legs. That seems kind of simple, somehow or another. I mean, virtually everybody over 10 months of age walks around on two legs, right? So that maybe is not that interesting. So instead maybe we want to choose something a little more complicated to look at. How about the visual system? There it is, the visual system. I mean, we love our visual systems. We do all kinds of cool stuff. Indeed, there are over 12,000 neuroscientists who work on the visual system, from the retina to the visual cortex, in an attempt to understand not just the visual system but to also understand how general principles of how the brain might work.

But now here's the thing: Our technology has actually been pretty good at replicating what the visual system does. We have TV, we have movies, we have animation, we have photography, we have pattern recognition, all of these sorts of things. They work differently than our visual systems in some cases, but nonetheless we've been pretty good at making a technology work like our visual system. Somehow or another, a hundred years of robotics, you never saw a robot walk on two legs, because robots don't walk on two legs because it's not such an easy thing to do. A hundred years of robotics, and we can't get a robot that can move more than a couple steps one way or the other. You ask them to go up an inclined plane, and they fall over. Turn around, and they fall over. It's a serious problem. So what is it that's the most difficult thing for a brain to do? What ought we to be studying? Perhaps it ought to be walking on two legs, or the motor system.

I'll give you an example from my own lab, my own particularly smelly question, since we work on the sense of smell. But here's a diagram of five molecules and sort of a chemical notation. These are just plain old molecules, but if you sniff those molecules up these two little holes in the front of your face, you will have in your mind the distinct impression of a rose. If there's a real rose there, those molecules will be the ones, but even if there's no rose there, you'll have the memory of a molecule. How do we turn molecules into perceptions? What's the process by which that could happen? Here's another example: two very simple molecules, again in this kind of chemical notation. It might be easier to visualize them this way, so the gray circles are carbon atoms, the white ones are hydrogen atoms and the red ones are oxygen atoms. Now these two molecules differ by only one carbon atom and two little hydrogen atoms that ride along with it, and yet one of them, heptyl acetate, has the distinct odor of a pear, and hexyl acetate is unmistakably banana. So there are two really interesting questions here, it seems to me. One is, how can a simple little molecule like that create a perception in your brain that's so clear as a pear or a banana? And secondly, how the hell can we tell the difference between two molecules that differ by a single carbon atom? I mean, that's remarkable to me, clearly the best chemical detector on the face of the planet. And you don't even think about it, do you?

14:23 So this is a favorite quote of mine that takes us back to the ignorance and the idea of questions. I like to quote because I think dead people shouldn't be excluded from the conversation. And I also think it's important to realize that the conversation's been going on for a while, by the way. So Erwin Schrödinger, a great quantum physicist and, I think, philosopher, points out how you have to "abide by ignorance for an indefinite period" of time. And it's this abiding by ignorance that I think we have to learn how to do. This is a tricky thing. This is not such an easy business.

14:52 I guess it comes down to our education system, so I'm going to talk a little bit about ignorance and education, because I think that's where it really has to play out. So for one, let's face it, in the age of Google and Wikipedia, the business model of the university and probably secondary schools is simply going to have to change. We just can't sell facts for a living anymore. They're available with a click of the mouse, or if you want to, you could probably just ask the wall one of these days, wherever they're going to hide the things that tell us all this stuff.

15:19 So what do we have to do? We have to give our students a taste for the boundaries, for what's outside that circumference, for what's outside the facts, what's just beyond the facts.

15:30 How do we do that? Well, one of the problems, of course, turns out to be testing. We currently have an educational system which is very efficient but is very efficient at a rather bad thing. So in second grade, all the kids are interested in science, the girls and the boys. They like to take stuff apart. They have great curiosity. They like to investigate things. They go to science museums. They like to play around. They're in second grade. They're interested. But by 11th or 12th grade, fewer than 10 percent of them have any interest in science whatsoever, let alone a desire to go into science as a career. So we have this remarkably efficient system for beating any interest in science out of everybody's head.

16:16 Is this what we want? I think this comes from what a teacher colleague of mine calls "the bulimic method of education." You know. You can imagine what it is. We just jam a whole bunch of facts down their throats over here and then they puke it up on an exam over here and everybody goes home with no added intellectual heft whatsoever.

16:35 This can't possibly continue to go on. So what do we do? Well the geneticists, I have to say, have an interesting maxim they live by. Geneticists always say, you always get what you screen for. And that's meant as a warning. So we always will get what we screen for, and part of what we screen for is in our testing methods. Well, we hear a lot about testing and evaluation, and we have to think carefully when we're testing whether we're evaluating or whether we're weeding, whether we're weeding people out, whether we're making some cut. Evaluation is one thing. You hear a lot about evaluation in the literature these days, in the educational literature, but evaluation really amounts to feedback and it amounts to an opportunity for trial and error. It amounts to a chance to work over a longer period of time with this kind of feedback. That's different than weeding, and usually, I have to tell you, when people talk about evaluation, evaluating students, evaluating teachers, evaluating schools, evaluating programs, that they're really talking about weeding. And that's a bad thing, because then you will get what you select for, which is what we've gotten so far.

17:44 So I'd say what we need is a test that says, "What is x?" and the answers are "I don't know, because no one does," or "What's the question?" Even better. Or, "You know what, I'll look it up, I'll ask someone, I'll phone someone. I'll find out." Because that's what we want people to do, and that's how you evaluate them. And maybe for the advanced placement classes, it could be, "Here's the answer. What's the next question?" That's the one I like in particular.

18:07 So let me end with a quote from William Butler Yeats, who said "Education is not about filling buckets; it is lighting fires."

18:15 So I'd say, let's get out the matches. Thank you.
