
A function “call” is a software abstraction. Defining a function call as stack-based is itself a pure software abstraction. A stack is one way to implement functions, but not the only way.

Capacitors can provide analog immediate access memory. Analog delay lines and feedback circuits can provide conceptually recursive functions, and those are common in analog circuits today.

It’s true that analog has limited precision, though there’s no reason you can’t represent higher precision via multiple analog signals, just as digital requires more bits for more precision. All the rest of it still looks to me like you’re making some incorrect assumptions about what’s possible with analog circuitry.



A function call is indeed a software abstraction. Outside of software nobody "calls" functions at all. That was my entire point.

In mathematics you have function application, which is close to the idea of the function call. That idea also doesn't exist in analog computing. Ironically, mathematics does model what's going on in an analog computer, in a field called signal processing. In signal processing it's called a transform.

>It’s true that analog has limited precision, though there’s no reason you can’t represent higher precision via multiple analog signals,

You can't. The only way to do this with multiple signals is to make each signal represent a digit, or a portion of the digits, of the final value. But if you did this you'd be going digital. You likely wouldn't be using binary, but it's still digital.
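
A toy sketch of what I mean (my own, purely illustrative): read each line as a discrete base-B digit and you have positional notation, i.e., digital in base B, even though no line is binary.

    # Hypothetical sketch: one "analog line" per base-B digit.
    # The moment each line's level is read as a discrete digit,
    # this is base-B positional notation, i.e., digital encoding.
    BASE = 10

    def to_lines(x, n_lines):
        # least significant digit first, one digit per line
        return [(x // BASE**i) % BASE for i in range(n_lines)]

    def from_lines(digits):
        return sum(d * BASE**i for i, d in enumerate(digits))

    assert from_lines(to_lines(4821, 4)) == 4821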


> You can’t

Speak for yourself. ;) Depends on how you define “digital”. Digital is typically defined as being based on logic gates, not necessarily on how you represent numbers. I say if your adder is built with op-amps and multiple lines, it’s doing analog computation on digits, and it stays analog longer than if you build it with CMOS gates.

This distinction is important when you start making signal integrators or matrix multipliers or do other computations with analog components, for example.


> Depends on how you define “digital”. Digital is typically defined as being based on logic gates, not necessarily on how you represent numbers.

This is 100% false. You are categorically wrong here. https://www.britannica.com/technology/digital-computer

It is not based on logic gates. It is computing based on discretized data and discretized steps. Since computing is usually about numbers, in digital computing the numbers and the steps that process them are discretized.

In analog computing nothing is discretized. The numbers and the processing of numbers are continuous.
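
To make the distinction concrete, a toy sketch (my own, hypothetical): an ideal ADC snaps a continuous voltage onto one of finitely many levels, and that snapping is the discretization that makes something digital.

    # Toy illustration (hypothetical): an ideal ADC maps a continuous
    # voltage onto one of 2**bits discrete levels.
    def quantize(v, v_max=1.0, bits=8):
        step = v_max / 2**bits          # smallest representable difference
        return round(v / step) * step   # snap to the nearest discrete level

    print(quantize(0.123456))  # 0.125: everything between the levels is lost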

You're very very much misinformed here.


>Depends on how you define “digital”

Dude, I don't define "digital"... the ENGLISH language defines the word. Read my other reply. You're confusing your made-up definition of digital with the one defined by English.

You're just reinventing a digital computer with parallel data lines that doesn't use binary. It doesn't even get at the "fuzzy" part of what it means to be an "analog computer." It's just a misunderstanding on your part.


> the ENGLISH language defines the word

Hehehe Huh?? English doesn’t define itself; people define words by how they’re used. Okay, so if you want to make the case for the English definition, let’s see…

Digital

1 : of, relating to, or utilizing devices constructed or working by the methods or principles of electronics : ELECTRONIC ("digital devices/technology"); also : characterized by electronic and especially computerized technology ("the digital age")

3 : providing a readout in numerical digits ("a digital voltmeter", "a digital watch/clock")

6 : of or relating to the fingers or toes

Okay, got it, anything electronic or anything with fingers & toes. Sounds like all analog electronic circuitry passes for digital according to the English language. :) I mean that’s actually true, most people do use “digital” as synonymous with “electronic”. That is why this definition is the first one: it’s the most used and de facto the most correct, because that’s how the English language works, but you obviously know that already. Also sounds like the THAT computer in the article is defined as digital, because it has a digital readout, bonus! You should write to the IEEE and the Anabrid corporation to have them correct their titles.

https://www.merriam-webster.com/dictionary/digital

I wasn’t really intending to talk about how to define the word digital, even though I see that’s what the sentence literally suggests out of context. I was trying to repeat the same thing I said earlier with slightly different words. I was talking about where you draw the line between analog and digital in order to summarize any given device as being one or the other, when in fact many devices are mixed. I’m not contradicting what you’re saying about digital (conversely, you’re not contradicting me either).

Here’s an example of research into purely analog computation with parallel data lines for the purpose of increasing precision, exactly what I was talking about, and it has at least one name: residue number system (RNS).

https://arxiv.org/abs/2306.09481
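
To make the idea concrete, here is a minimal sketch (plain Python, my own illustration, not code from the paper): a value is split into small residues modulo pairwise-coprime moduli, each residue small enough to ride on its own low-precision line, and the value is recovered with the Chinese remainder theorem.

    # Minimal residue number system sketch (hypothetical, not the paper's code).
    # Each residue is small, so it fits the limited precision of one line.
    from math import prod

    MODULI = (7, 11, 13, 15)  # pairwise coprime; dynamic range = 15015

    def rns_encode(x):
        # one small residue per line
        return tuple(x % m for m in MODULI)

    def rns_decode(residues):
        # Chinese remainder theorem reconstruction
        M = prod(MODULI)
        x = 0
        for r, m in zip(residues, MODULI):
            Mi = M // m
            x += r * Mi * pow(Mi, -1, m)  # modular inverse, Python 3.8+
        return x % M

    assert rns_decode(rns_encode(12345)) == 12345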

If you slowed down a little instead of making so many assumptions and attacking, we could have a productive conversation. You’re misunderstanding me and stating things in binary terms when it comes to analog or digital, and still not acknowledging or understanding that there is a lot of space between fully analog and fully digital, and a spectrum of cross-over points and hybrids.


You're cherry-picking definitions. We are talking about digital computers, not some colloquial usage of the word digital as in “digital watch”. This is about digital computers and analog computers, and when people talk about these things in those terms they mean exactly the Britannica definition I gave you.

>If you slowed down a little instead of making so many assumptions and attacking

You are the one who needs to slow down, bro. I never made a single attack. I only remarked on your statements, saying whether they are right or wrong, or whether you don't understand. I never attacked your personal character. This accusation came out of nowhere; if anything, that statement is the one closest to crossing the line. You need to take a chill pill and relax. There are no attacks here.

>You’re misunderstanding me and stating things in binary terms when it comes to analog or digital, and still not acknowledging or understanding that there is a lot of space between fully analog and fully digital, and a spectrum of cross-over points and hybrids.

There is a lot of space, but this is where you are completely wrong, because we aren't just talking about digital and analog. We are talking about digital and analog computing. Key word: computing. I'm not being pedantic here. The colloquial usage of the terms "digital computing" and "analog computing" means exactly the Britannica definition. Most people who know what they are talking about use the terms this way, and I'm informing you that you are out of the loop and you don't get it.

Then you misinterpret that as an attack when it's anything but.

>https://arxiv.org/abs/2306.09481

This paper isn't about the analog vs. digital computing debate we are having. It doesn't define what they are doing as digital or analog, so it doesn't lend support in either direction. It's simply optimizing an analog process by digitizing analog values in another number system. RNS is a mathematical concept; it is not the name of the overall technique they are utilizing here.

This is a hybrid, which I also talked about earlier. But you are wrong to call this new thing an "analog computer". The benefits are in line with what I stated originally: only speed and energy. Remember, your earlier claim was about analog computing, but here they are literally feeding that analog line into an ADC. Which again supports my point: the only way to increase precision of multiple analog signals is to digitize them. This is not a purely analog computation.


> the only way to increase precision of multiple analog signals is to digitize them

What do you mean? Digitizing doesn’t increase the precision of anything; it always loses precision. I don’t think you’re summarizing that paper accurately: they are most certainly talking about using analog compute units, specifically analog matrix-vector multipliers, and a method for increasing analog signal precision before digitizing the signal. The components used are analog compute circuits, and the title of the paper has “analog” in it. This is absolutely relevant to what I was trying to say, because RNS is used here to make multiple analog data lines representing a value provide increased precision compared to a single analog data line.


No, they digitize the signal to get the residues. Look at the diagram.

It's not relevant, because digital operations are needed to get the final result BEFORE they do the final ADC conversion. It's like saying that because analog transistors are used to make digital gates, all computers with gates are analog.

They are talking about an analog MVM in a system that is doing hybrid computation, both digitally and with analog components. It completely fails to support your point about increasing precision in an analog computer with multiple analog signals. It's basically the same thing as using one analog signal per digit, except they're using one signal per modulus.
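
To show the parallel concretely (a toy sketch of my own, not the paper's implementation): both schemes break one wide value into several narrow per-line values; only the decomposition rule differs.

    # Hypothetical side-by-side: digit-per-line vs. residue-per-line.
    MODULI = (7, 11, 13, 15)  # pairwise coprime
    x = 4821

    per_digit   = [(x // 10**i) % 10 for i in range(4)]  # one line per digit
    per_residue = [x % m for m in MODULI]                # one line per modulus

    print(per_digit)    # [1, 2, 8, 4]
    print(per_residue)  # [5, 3, 11, 6]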



