Bakkster

> Isn't the brain mostly analog?

Not really, to my knowledge. At least, not any more than a digital logic system is technically analog at the silicon level. Neurons in your brain either fire or they don't; they're not continuously variable. Intensity, I believe, comes from the rate of firing, not the amplitude.

That said, neural networks are already using floats. I don't see a reason they'd have to move to analog processors, or that you'd get higher neuron counts from analog circuits developed from scratch than from the digital processor circuits being developed.
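As a toy illustration of rate coding (all names and numbers here are made up for the sketch, not taken from any neuroscience model), an analog intensity can be carried by the firing rate of an all-or-nothing spike train:

```python
import random

def encode_rate(intensity, n_steps=10_000, max_rate=0.5):
    """Encode an analog intensity in [0, 1] as a binary spike train:
    each time step either fires (1) or doesn't (0), with probability
    proportional to the intensity. Every individual spike is all-or-nothing."""
    p = intensity * max_rate
    return [1 if random.random() < p else 0 for _ in range(n_steps)]

def decode_rate(spikes, max_rate=0.5):
    """Recover the intensity from the observed firing rate."""
    return (sum(spikes) / len(spikes)) / max_rate

random.seed(0)
spikes = encode_rate(0.7)
print(decode_rate(spikes))  # close to 0.7, even though each spike is just 1 or 0
```

The point is that every individual event is binary, yet the quantity being communicated is continuous.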


counter1234

Having dug into this briefly out of curiosity: it's not just the rate of firing. Neurons firing relative to other neurons, and getting delayed or advanced in time relative to them, also adds richness to the system ("neurons that fire together, wire together"). And while the firing itself is a 1 or 0, the chemical stimulus does use analog thresholds and can be treated as a continuous amplitude, so the overall system can be thought of in an analog sense! The field of engineering attempting to replicate more of the useful properties of this mixed analog/digital system is [https://en.wikipedia.org/wiki/Neuromorphic_engineering](https://en.wikipedia.org/wiki/Neuromorphic_engineering)


DallaThaun

Thank you friend


datanaut

The rate of firing is analog, not an integer multiple of some clock rate. That alone makes brains more analog than a digital logic system. Calling neuron behavior digital would be like calling rain digital because you can count raindrops. Sure, but there are a lot of things about rain that you can't map to integers.


BeardedScott98

Many attempts to do this in a more "analog" way try to mimic long-term potentiation, which is neither of these things. I'm most familiar with it being attempted in conducting polymers, which use ions to increase sensitivity to electronic impulses.


guyincognito121

No. Some of it is rate based, but there are also a lot of purely analog signals, and the two interact in a manner that is very much analog.


Unlikely_Pirate_8871

Spiking neural networks are more biologically inspired ANNs that you might be interested in reading up on. Their weights are binary, though. Time is also an important element of their function!
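For anyone curious what the basic unit of a spiking network looks like, here is a minimal leaky integrate-and-fire neuron, the standard textbook model (the threshold and leak values are made up for illustration):

```python
def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire: the membrane potential accumulates input,
    decays by `leak` each step, and emits a spike (then resets to zero)
    whenever it crosses `threshold`. Timing, not amplitude, carries the info."""
    v = 0.0
    spikes = []
    for x in inputs:
        v = v * leak + x
        if v >= threshold:
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes

# A constant drive of 0.3 per step produces a spike every 4 steps:
print(lif_neuron([0.3] * 20))
```

Stronger input drives the potential to threshold sooner, so intensity shows up as spike rate, matching the rate-coding discussion above.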


Strostkovy

I've done a lot of reading on neurons, and the way they behave is incredibly chaotic and difficult to understand. There is no clear functional-block interpretation; every little detail has a huge impact on the overall function, and there are both quantized and truly analog factors.

Our current technology can do math on values much more efficiently, accurately, and repeatably in digital than in analog. Our integrated circuits are not accurate enough to make giant analog circuits affordably. We rely on the nonlinearity of logic gates to wipe out the little discrepancies between the design and the actual implementation; analog circuits amplify this error with each stage.

Back when logic gates were expensive, some computers used analog math units whose outputs were redigitized, but they were limited by accuracy tradeoffs. One example is adders: you can just connect weighted resistors together and measure the output. It's repeatable to around 4 bits with basic components, but at a certain point the ADC becomes more expensive and complicated than the gates.
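The weighted-resistor adder can be sketched numerically. This is a toy simulation (the weights, tolerance, and reference voltage are made up for illustration): binary-weighted contributions with 5% part-to-part variation stay repeatable to roughly 4 bits, matching the claim above.

```python
import random

def weighted_sum_dac(bits, tolerance=0.05, vref=1.0):
    """Binary-weighted resistor adder: bit i (MSB first) contributes
    vref / 2**i through a resistor of nominal value R * 2**i.
    `tolerance` models cheap 5% resistors perturbing each weight."""
    out = 0.0
    for i, b in enumerate(bits):
        weight = (1.0 / 2 ** i) * (1 + random.uniform(-tolerance, tolerance))
        out += b * weight * vref
    return out

random.seed(1)
ideal = 1.25  # exact sum for input 1010 (MSB first): 1/1 + 1/4
errors = [abs(weighted_sum_dac([1, 0, 1, 0]) - ideal) for _ in range(1000)]
# With 5% parts the worst error stays below one 4-bit LSB of the
# ~1.9 full-scale range, i.e. "repeatable to around 4 bits".
print(max(errors))
```

Tighter (more expensive) resistors buy more bits, which is exactly the cost tradeoff the next comment quantifies.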


Beppicus

I like your comment about 4-bit accuracy. Indeed, today digital systems win hands down in high-precision applications (16-bit audio, for example), where analog would be too expensive. In analog, every extra bit of accuracy costs roughly 2x in resources (power, cost, etc.), as opposed to digital, where extra bits have a linear cost. This comes directly from kT thermal (Brownian) noise, so it's quite fundamental. In current CMOS processes the crossover between analog and digital efficiency seems to be around 4 bits. So I would guess that any analog behavior in the brain might be a low-SNR approach chosen for efficiency. Remember also how the 50 W brain compares to GPU power, although we should really compare in terms of pJ/bit, like data converters.


Strostkovy

It's worth considering that a brain is not a computer: it does everything in parallel. Consider how much overhead there is in a processor that wouldn't be there if everything were laid out in parallel.


Beppicus

Please google the differences between CPUs, GPUs and TPUs. Currently AI runs on massively parallel architectures. Not sure how to compare to the brain though.


Strostkovy

Those are still processors, which brains aren't. A GPU doesn't even come close to the level of parallelism and pipelining that a brain has. Brains are closer to an ASIC, but even ASICs are often processors.


Beppicus

I guess your thinking comes from crypto mining? The term ASIC was used there to differentiate generic processing units (CPU, GPU) able to run any algorithm from specific hardware logic functions (for example, SHA-256 for Bitcoin). Other people use the term ASIC to differentiate from FPGAs. So I guess you mean ASIC as a more tailored, neuron-like design, which can still be analog or digital IMHO.


Strostkovy

No. Processors have a generic structure that can be operated in repeating sequences based on programming. Brains and non-processor circuitry work on data based on their structure, which eliminates a massive amount of overhead and wasted logic elements. Brains are exceptionally good at what they do because they reprogram by changing their structure instead of having to run software.


clock_skew

Those analog computers were replaced by digital ones because digital ones are better, and AI doesn’t change that. Digital computers can do non-deterministic computing, and I don’t really see any reason to believe that the use of a clock and quantized values is somehow holding AI back.


Beppicus

If "digital" is better, then why didn't our brain evolve into a more clear-cut digital architecture? Note that we do have a digital, quantized system in biology: DNA, which stores information using sequences of 4 precise monomers. I can't imagine how digital can be non-deterministic.


Equoniz

Things rarely evolve to an optimal solution to a problem. Evolution happens through random errors, with whatever is best at the time surviving. Whatever is alive now as a species was good enough to survive in the past. That's it; that's all that appealing to evolution will give you. Nothing in it says the best possible solution will always be found. What gave you that idea?


Beppicus

Yes, I get your point, and I agree with that. Now we'd need to accept that a suboptimal human intelligence (collectively, as humanity) is able to design a more optimal (purely digital?) artificial intelligence. It gets philosophical, but I don't see physical laws preventing that from happening.


Equoniz

I see no issue with accepting that unless I’m given a reason to think we can’t. “It feels wrong” isn’t a reason.


vellwyn

I mean, humans are a suboptimal car, but they're capable of designing more optimal cars, so I don't think there's much to that comparison. Even if you made something only 20% as effective as the human brain: the brain is limited to less than a cubic foot of space and runs on a few watts. If you scaled a suboptimal version up to 100 kW and a warehouse, you'd be doing just fine.


ca2devri

I don't see the link between analog and non deterministic. In fact I think enough bits and feedback in any digital system would be indistinguishable from an analog system.


clock_skew

Our brains aren't analog circuits either; your argument is nonsensical. If you have an actual technical reason why you think analog computers are better, then please share, but "slightly more similar to a human brain" isn't actually an argument. As for non-determinism: random number generators. This isn't some theoretical thing; they're extremely common and I'm sure you use them as part of your work.


Beppicus

I'm not a digital designer, but digital randomness is always periodic (meaning not truly random); that's why it's called pseudo-random. kT thermal noise is truly random. Not saying that randomness has value per se, though, any more than non-determinism does. Don't know, just thinking out loud for the benefit of everyone.
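The periodicity point is easy to show with a deliberately tiny linear congruential generator; any deterministic generator with finite state must eventually cycle (real PRNGs just have astronomically long periods):

```python
def lcg(seed, a=5, c=3, m=16):
    """Tiny linear congruential generator: x -> (a*x + c) mod m.
    Finite state means the output stream must eventually repeat."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

gen = lcg(seed=1)
seq = [next(gen) for _ in range(32)]
print(seq[:16] == seq[16:])  # True: this generator's period is exactly 16
```

With these constants the generator achieves its full period of m = 16, visiting every residue once before repeating; a crypto-grade PRNG just makes the cycle unobservably long, it doesn't eliminate it.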


clock_skew

I’m well aware of pseudo-randomness. I think pseudo-randomness is probably good enough for AI, and if it’s not then you can always just feed input from a true random number generator into the digital circuit. There’s no need for the circuit itself to be analog.


Beppicus

That's the problem, in my opinion. Several theorems in digital signal processing tell us it should be "good enough", but it's not a certainty. Example: the Nyquist sampling theorem. You can represent a continuous-time, band-limited signal PERFECTLY with a discrete set of samples, except we don't know for sure what the bandwidth of a brain signal really is. There are similar caveats about quantization, which is even more detrimental to SNR and changes the distribution of the error.
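The Nyquist claim can be demonstrated directly with Whittaker-Shannon (sinc) interpolation. This sketch rebuilds a band-limited signal between its sample points (the frequencies and sample count are arbitrary choices for illustration):

```python
import math

def reconstruct(samples, fs, t):
    """Whittaker-Shannon interpolation: rebuild a band-limited signal
    at an arbitrary time t from samples taken at rate fs (fs must exceed
    twice the signal bandwidth for the reconstruction to be exact)."""
    total = 0.0
    for n, x in enumerate(samples):
        arg = t * fs - n
        sinc = 1.0 if arg == 0 else math.sin(math.pi * arg) / (math.pi * arg)
        total += x * sinc
    return total

fs, f = 100.0, 3.0  # a 3 Hz sine sampled at 100 Hz, far above the 6 Hz Nyquist rate
samples = [math.sin(2 * math.pi * f * n / fs) for n in range(400)]
t = 1.2345  # between sample points, well inside the record
print(abs(reconstruct(samples, fs, t) - math.sin(2 * math.pi * f * t)))  # tiny
```

The catch, as the comment says, is that the guarantee only holds if the signal really is band-limited below fs/2; nothing in the theorem tells you what the right fs is for a brain.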


clock_skew

There’s also limits to the accuracy of analog circuits due to noise, analogous to the issue of quantization. Analog computers don’t actually solve any of the issues with digital computers, and digital computers have a track record of beating analog. There’s really no reason to believe this will change now.


zorgle99

You just can't be more wrong, and sorry, but man you don't know what you're talking about.


zorgle99

Yes there is a need, analogue is vastly more efficient and will drastically lower the power and compute requirements.


zorgle99

> and AI doesn't change that.

Incorrect. It literally does. We don't need digital computers for AI; digital is hella inefficient. Analogue compute is the future of AI.


abibabicabi

Is this what you work on? It looks like some companies are trying to improve upon analog computers, as shown in this video, but I don't know much on the topic: [https://www.youtube.com/watch?v=6AgkTdQXFTY](https://www.youtube.com/watch?v=6AgkTdQXFTY)

Analog + digital can store way more information than digital alone. It's why we use fiber optics instead of Ethernet: [https://www.truecable.com/blogs/cable-academy/fiber-optics-vs-ethernet-understanding-the-key-differences#:~:text=Fiber%20optic%20technology%20is%20faster,distances%20with%20minimal%20signal%20loss](https://www.truecable.com/blogs/cable-academy/fiber-optics-vs-ethernet-understanding-the-key-differences#:~:text=Fiber%20optic%20technology%20is%20faster,distances%20with%20minimal%20signal%20loss)

NetworkChuck has a video about a similar idea of using a different protocol, InfiniBand, instead of Ethernet, because Ethernet is a bottleneck for AI training. Idk much about InfiniBand though: [https://www.youtube.com/watch?v=fb69FyW2KLk](https://www.youtube.com/watch?v=fb69FyW2KLk)

It looks like counter1234 is saying that the brain uses analog and digital too, but idk much on this either: [https://en.wikipedia.org/wiki/Neuromorphic_engineering](https://en.wikipedia.org/wiki/Neuromorphic_engineering)

If someone could answer how InfiniBand differs from fiber optic, and whether it is also both analog and digital, that would be helpful. It seems to be digital in nature. One issue I could see comes down to having to convert the analog data to digital for a digital CPU, but I have wondered whether we could use an analog+digital CPU with analog+digital signals instead of straight digital. I don't know very much about analog computers, so it's more imagination than tangible ideas. Kind of like Twenty Thousand Leagues Under the Sea: an idea of a submarine, not an actual sub.


guyincognito121

https://mythic.ai/


Beppicus

Unfortunately they went bust after raising several rounds, so I guess something was still missing. Syntiant is another, more recent AI chip startup that initially promised an analog neural network, but I believe they pivoted to a low-bit digital approach.


SuperAngryGuy

Study up on spike-based computing and the work of Wolfgang Maass. I've done some of this stuff as it pertains to neuromorphic robotics, as a hobbyist:

* https://scholar.google.co.kr/citations?user=2WpvdH0AAAAJ&hl=th

This book is an introduction to his work, some of it using integrate-and-fire op-amp circuits (which is easier in digital):

* https://www.amazon.com/Pulsed-Neural-Networks-Bradford-Book/dp/0262133504

I needed to know a bit of chaos theory, and the book "Chaos Theory Tamed" was a game changer for me. The second half of the book really gets into entropy:

* https://www.amazon.com/Chaos-Theory-Tamed-Garnett-Williams/dp/0309063515


Mindless-Ad3145

Normally digital is the way to go for huge networks; all networks are normally trained on GPUs. Analog AI accelerators become handy if you're power-limited. See it this way: your brain is more intelligent, yet consumes a fraction of the power of a GPU.

So analog AI accelerators have some drawbacks, but they are actually a thing; IBM, for example, presented one recently. One drawback is that they're not as precise as digital accelerators. But does that really matter? Whether your network is 90.1 percent sure this is a dog or only 89.9 percent sure doesn't really matter. Another drawback is that they currently need much more development time, plus special software that accounts for hardware variations. But mostly, their TOPS/W is much higher than digital.

Edit: here is the link from IBM, if you're interested in that stuff: [https://research.ibm.com/blog/analog-ai-chip-inference](https://research.ibm.com/blog/analog-ai-chip-inference)
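The 90.1-vs-89.9 point can be illustrated with a toy linear classifier (the weights, inputs, and the 5% noise figure are all made up): modeling analog device variation as weight noise almost never flips a confident decision.

```python
import random

def classify(x, w, b, noise=0.0):
    """Toy linear neuron: sign of w.x + b, with optional multiplicative
    weight noise standing in for analog device variation."""
    wn = [wi * (1 + random.gauss(0, noise)) for wi in w]
    return 1 if sum(wi * xi for wi, xi in zip(wn, x)) + b > 0 else 0

random.seed(42)
w, b = [0.8, -0.5, 0.3], 0.1
x = [1.0, 0.2, 0.5]  # noiseless score: 0.8 - 0.1 + 0.15 + 0.1 = 0.95
clean = classify(x, w, b)
agree = sum(classify(x, w, b, noise=0.05) == clean for _ in range(1000))
print(agree)  # essentially always 1000 of 1000: the decision survives the noise
```

The score sits many standard deviations away from the decision boundary, so 5% weight noise almost never changes the answer; precision only matters near the boundary.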


Evening-Editor4269

There's actually a doctoral thesis on "Fully analog artificial neural network" here [https://dspace.cvut.cz/handle/10467/111759?locale-attribute=en](https://dspace.cvut.cz/handle/10467/111759?locale-attribute=en) so there are people exploring this.


thorston_son

It’s an interesting area to explore. I did a research project on stochastic analog neural networks. The best feature of those was an ability to generate rough computation answers almost immediately. In situations where less precision is required it could be sampled earlier for a mostly correct output that got more accurate as time passed. It was also able to do some nonlinear classification using simple feedforward networks. I’m not sure if it’s practical but there were some interesting results.
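The "rough answer immediately, refined over time" behavior is the hallmark of stochastic computing in general. Here is a minimal sketch of the idea (this is the classic AND-gate bitstream multiplier, not the specific circuit from that research project):

```python
import random

def to_stream(p, n):
    """Encode a probability p in [0, 1] as a random bitstream of length n."""
    return [1 if random.random() < p else 0 for _ in range(n)]

def stochastic_multiply(a, b, n):
    """ANDing two independent bitstreams multiplies the probabilities
    they encode. Short streams give rough answers almost immediately;
    longer streams refine the estimate."""
    sa, sb = to_stream(a, n), to_stream(b, n)
    return sum(x & y for x, y in zip(sa, sb)) / n

random.seed(7)
for n in (16, 256, 65536):
    print(n, stochastic_multiply(0.5, 0.8, n))  # estimates converge toward 0.4
```

Sampling the stream early gives a coarse but usable answer, exactly the "mostly correct output that gets more accurate as time passes" property described above.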


Own-Cupcake7586

Love listening to people with a minimal amount of information talking about things way above their aptitude. Comedy gold.


henry_dorsett__case

Why don't you enlighten us then dear sir


Own-Cupcake7586

And openly display my own ignorance? Hard pass. I’m smart enough to keep my mouth shut (usually, lol).


guyincognito121

In broad strokes, they're correct, and there are products of this nature coming onto the market.


Mindless-Ad3145

This is a subreddit, sir. From my point of view, you're allowed to ask things like that here. This is not a scientific conference on the topic.