
Relistening to 《成像世界的奇境》 (The Wonderland of the Imaging World)

21.11.22 Chains

  • Shannon-Nyquist sampling theorem
  • Measurement: the uncertainty principle
    • High-speed imaging:
      • STEAM: $10^{-10}$ s temporal resolution
      • STAMP: $10^{-12}$ s
      • State of the art: $10^{-15}$ s, approaching the timescale of electron transfer!

Information in Imaging

  • Shannon: his master's thesis founded information theory
    • Do something tedious, or pursue your own original thinking?

Yes/no questions: how many questions (y/n) must you ask to be certain of the color of the ball you have?

  1. 8 red balls?
  2. 4 red & 2 blue & 1 black & 1 white?
  3. 2 red & 2 blue & 2 black & 2 white?
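The answers can be read off the Shannon entropy of each distribution: case 1 needs 0 questions, case 2 averages 1.75 questions with an optimal strategy, case 3 always needs 2. A minimal sketch:

```python
import math

def entropy(counts):
    """Shannon entropy in bits of the distribution given by counts."""
    total = sum(counts)
    return sum((c / total) * math.log2(total / c) for c in counts if c > 0)

print(entropy([8]))           # 0.0 bits: the ball is certainly red, ask nothing
print(entropy([4, 2, 1, 1]))  # 1.75 bits: ask about red first
print(entropy([2, 2, 2, 2]))  # 2.0 bits: two questions are always required
```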

Shannon information entropy

Information loss in the Fourier transform?

  • Taking the Fourier transform involves bit depth (quantization)
  • In a 2D Fourier transform, the phase can be recovered from the magnitude
  • Take the Fourier transform, then the inverse transform: how similar is the result?
    • SSIM
  • After the Fourier transform, information is dense at low frequencies and sparse at high frequencies.
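A quick sketch of the round-trip question above: forward followed by inverse DFT recovers the signal up to floating-point error, so any real information loss comes from quantization (bit depth), not from the transform itself. A naive pure-Python DFT, for illustration only:

```python
import cmath

def dft(x):
    """Discrete Fourier transform (naive O(N^2), for illustration only)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Inverse DFT."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

signal = [1.0, 2.0, 3.0, 4.0]
roundtrip = idft(dft(signal))
err = max(abs(r - s) for r, s in zip(roundtrip, signal))
print(err)  # on the order of 1e-15: the transform itself is lossless
```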

JPEG compression: quantize away high-frequency information, conceding those bits to the low frequencies

Signal Sparsity: not every frequency in an image needs to be sampled! We can extract the components of the Fourier transform that carry the most information.

  • Compressive sensing
    • Recover the signal as faithfully as possible from sparsely sampled key measurements
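As a minimal illustration of sparsity (not full compressive sensing, which uses random measurements and $\ell_1$ recovery): a two-tone signal occupies only 4 of 64 frequency bins, so keeping just those 4 DFT coefficients reconstructs it almost exactly.

```python
import cmath, math

def dft(x):
    """Naive O(N^2) DFT, for illustration only."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Inverse DFT, returning the real part."""
    N = len(X)
    return [(sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N).real
            for n in range(N)]

N = 64
# Two-tone signal: sparse in the frequency domain (only 4 nonzero DFT bins)
x = [math.sin(2 * math.pi * 3 * n / N) + 0.5 * math.sin(2 * math.pi * 10 * n / N)
     for n in range(N)]

X = dft(x)
# Keep the 4 largest-magnitude coefficients; zero out the other 60
keep = set(sorted(range(N), key=lambda k: abs(X[k]), reverse=True)[:4])
X_sparse = [X[k] if k in keep else 0.0 for k in range(N)]

x_rec = idft(X_sparse)
err = max(abs(a - b) for a, b in zip(x, x_rec))
print(err)  # well below 1e-9: 4 of 64 coefficients suffice
```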

21.11.29 Law of Imaging: Measurement and statistics

Noise

Photon Noise

The more photons there are, the lower the relative noise.

Photons arrive at random, following a Poisson distribution $P(X=k) = \frac{\lambda^k\cdot e^{-\lambda}}{k!}$; the error scales as the square root of the photon count.
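Since Poisson noise has standard deviation $\sqrt{N}$, the relative noise falls as $1/\sqrt{N}$. A one-line sketch:

```python
import math

# Relative shot noise for Poisson photon arrivals: std/mean = sqrt(N)/N = 1/sqrt(N)
for photons in (100, 10_000, 1_000_000):
    rel_noise = math.sqrt(photons) / photons
    print(f"{photons:>9} photons -> relative noise {rel_noise:.3%}")
```

Going from 100 to 1,000,000 photons drops the relative noise from 10% to 0.1%.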

However, if noise resonates with the signal, it can lower the detection threshold of the signal: stochastic resonance.

Photons: bunched or antibunched? Bunched! The probability of detecting (or not detecting) a photon follows a binomial distribution.

All elementary particles, classified by spin:

  1. Integer spin: bosons. They do not obey the exclusion principle; two particles can occupy the same state simultaneously, following Bose-Einstein statistics.
  2. Half-integer (e.g. $\frac{1}{2}$) spin: fermions. They obey the Pauli exclusion principle and cannot occupy the same state simultaneously.

For photons: if $h\nu \leq kT$, wave behavior dominates; otherwise particle behavior dominates.

How can random light be made more uniform? Add a "gate": once a photon is received, briefly block the photons that follow. Such sub-Poissonian light has lower photon noise; its measured relative noise is far below that of an ordinary source. This technique is applied in gravitational-wave detection to reduce noise in the laser beam.

DOI: 10.1063/1.881246

Signal to Noise Ratio SNR

For light: $SNR = \frac{N_S}{\sqrt{N_S}} = \sqrt{N_S}$ (ignoring sensor noise)

The SNR differs depending on whether light behaves as particles or as waves, so the measured quantity differs too. Particles: photon count; waves: amplitude.

Information Capacity of Imagers

Shannon's theorem: $C = \frac{B}{2} \log_2(1+\frac{S}{N})$, per channel.

Space-bandwidth product: $SBP = \frac{A}{G^2}$

Hence the maximum imaging information capacity of a camera: $Capacity = \frac{A}{G^2} R \frac{B}{2} \log_2(1+\frac{S}{N})$
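A back-of-the-envelope sketch of the capacity formula. All numbers below (sensor area, pixel pitch, frame rate, SNR) are hypothetical, chosen only to exercise the formula, and one sample per pixel per frame is assumed (the bandwidth factor folded into the frame rate):

```python
import math

# Hypothetical numbers, chosen only to exercise the capacity formula
A = 1e-4      # sensor area: 1 cm^2, in m^2
G = 5e-6      # pixel pitch: 5 um
R = 60        # frame rate, frames/s
snr = 1000    # signal-to-noise ratio S/N

sbp = A / G**2                             # space-bandwidth product: 4e6 pixels
bits_per_pixel = 0.5 * math.log2(1 + snr)  # (1/2) log2(1 + S/N) bits per sample
capacity = sbp * R * bits_per_pixel        # bits per second
print(f"SBP = {sbp:.0f} pixels, capacity = {capacity/1e9:.2f} Gbit/s")
```

With these numbers the camera tops out around 1.2 Gbit/s of image information.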

Information Processing Limit

Information and Energy

Total data on the internet: 175 ZB. Such a volume of information drives rapidly growing compute demand for AI algorithms, and the electricity consumed by data analysis is soaring.

But Moore's law is running out of headroom, and total human electricity generation is also hitting a bottleneck.

Kardashev Scale

  • Ranking civilizations by their capacity to harness energy.

Information Processing

Von-Neumann architecture

Memory-wall bottleneck: storing data consumes enormous energy and time, for both reads and writes. For example, when classifying CIFAR-10 images, memory-access energy exceeds compute energy by more than 10x.

From this one can estimate the limit of compute energy efficiency.

512 KB needs 1.8 GFLOPS

Big Data Explosion. We cannot cope with such a volume of information; we must improve energy efficiency.

Ways to overcome

  1. Device-level compute-in-memory neural-network architectures, with an energy potential of ~1000 TOPS/W
  2. Explore new eye-like neural-network structures that fully exploit the advantages of compute-in-memory devices

Limit for Computing

Computing: a mapping?

Liu Cixin, The Three-Body Problem: the human-formation computer?

Turing Computers

  • input -> compute unit -> output

Computing unit:

  • Human brain: neural network: probability
  • Computer: von Neumann architecture: logic

Power Consumption

  • input
  • output

Maxwell's Demon: trading information for energy

Szilard Engine (1929): erasing 1 bit of information dissipates at least $kT \ln 2$ joules (Landauer's principle).
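Plugging in numbers at room temperature gives the scale of this bound:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in SI since 2019)
T = 300.0            # room temperature, K
e_bit = k_B * T * math.log(2)    # minimum energy to erase one bit
print(f"{e_bit:.3e} J per bit")  # 2.871e-21 J per bit
```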

  • Optical Computing
    • Power consumption could be very low!
    • Operate directly on photons: image in, state out
  • Quantum Computing
    • The energy used for the computation itself can be negligible.

Noesis

The chains are made of rubber instead of steel. Can human beings break the chains and find the way to Noesis?

Question and Exploration

Mission of human:

  • To question the known and to quest for the unknown.

Intelligence

Time flies like an arrow

Arrow of Time

  • Entropy as the arrow of time
  • Maxwell Demon
  • Infinite monkey theorem
    • As long as it is infinite, anything could be typed.

Gene Epoch

Life as a Turing machine: what if nature happens to nurture a "Turing machine", say mRNA?

Foreseeing is of ultimate importance!

Correlation is the basis of seeking advantage and avoiding harm. Based on that, intelligence can produce schemes of reaction.

Origin of genes: mRNA and DNA can now be produced in the lab.

Remembering the causal correlations is the key aspect of wisdom.

Memory is the mother of all wisdom.

Memory capacity for life in the dark ages: the gene's capacity is on the order of gigabytes.

We are survival machines, robot vehicles.

Life is nothing but the endless reincarnation of genes.

Neuron Epoch

The advent of eyes: a massive growth in the information obtained. We need more capacity for processing information.

The advent of brain.

Neurons in the brain: humans have on average 86 billion neurons.

Neurons:

  • Dendrites
  • Axon

input –> threshold –> output

Each neuron has on average 7000 synaptic connections, so a human has approximately 10^15 connections.

There are 26 distinguishable synaptic strengths, corresponding to 4.7 bits of information at each connection.

Thus the human neural network stores roughly 10^15 * 4.7 bits of weights.
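The arithmetic above, spelled out (86 billion neurons and 7000 synapses each are the lecture's figures; note the product is nearer $6\times10^{14}$, i.e. of order $10^{15}$):

```python
import math

neurons = 86e9          # average human neuron count (lecture figure)
synapses_per = 7000     # average synaptic connections per neuron (lecture figure)
connections = neurons * synapses_per

bits_per_synapse = math.log2(26)   # 26 distinguishable strengths -> ~4.7 bits
total_bits = connections * bits_per_synapse

print(f"{connections:.1e} connections at {bits_per_synapse:.2f} bits each")
print(f"= {total_bits:.1e} bits, i.e. roughly {total_bits / 8 / 1e12:.0f} TB of weights")
```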

The biggest difference between humans lies in our connections.

Connectome

Individual memory: up to 200 MB of long-term memory.

Life is no longer merely a big robot of genes.

Every life has its independent surviving tricks, knowledge. We react to the real world in our own way based on our own knowledge.

Problem: it perishes with the individual. Death means the dissipation of information beyond the genes.

Language Epoch

language -> Collectiveness -> Society

Collective culture: Division of labor! It is -> it has

With languages, knowledge can be accumulated, shared and learnt. “Knowledge doubling curve”.

Knowledge wouldn’t fade with the death of individual.

Problem: humans are swamped by the explosion of knowledge.

AI Epoch

"Can machines think?" - Turing. 1956, at Dartmouth College in Hanover.

The curse of dimensionality: the number of connections needed to handle the input grows faster than exponentially. The first AI winter.

AI expert system:

From vacuum tubes to integrated circuits.

Inventing giant magnetoresistance

Machine learning

AI in the future: Singularity?

Moravec's Paradox: what is easy for humans is hard for computers, and vice versa.

The ceiling of intelligence

Computational limit

The neural network is implemented at the software level, on top of the von Neumann architecture.

Neural-network architecture vs. von Neumann architecture

Near-data computing; device-level compute-in-memory.

Szilard limit

3.5e8 TOPS/W
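This figure follows directly from the Szilard/Landauer bound, under the assumption that each operation erases exactly one bit at room temperature:

```python
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # room temperature, K
e_bit = k_B * T * math.log(2)        # minimum energy per bit erasure

ops_per_joule = 1.0 / e_bit          # one bit erasure per operation (assumption)
tops_per_watt = ops_per_joule / 1e12
print(f"{tops_per_watt:.2e} TOPS/W")  # 3.48e+08 TOPS/W
```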

Can AI go beyond human?

What fills you with awe?

metaphor - there is

Is the neural network universal?

window function.

As long as we have enough memory capacity, a neural network can fit any curve.

We must know, we will know.

Will machines think as humans do?

The rules of morality are not the conclusion of our reason

Science cannot make value judgements! Humans have to design AI's morality seriously.

One thing is nearly certain: the human brain is not the most powerful processor.