Using ADT to mean “abstract data type” adds nothing beyond what most people mean when they say “type”. Outside of distinguishing it from other meanings of the word “type”, there is no practical reason to ever say it. It doesn’t sound fancy; it just sounds like getting high on your own supply of acronyms. It is not an important term for a beginner to learn in the first chapters of a programming book. We should stop using it in resources like this.
Alternatively, ADT can refer to “algebraic data type”, which means the ability to compose types together, thereby adding or multiplying the number of different values the resulting type can take. Product types (aka structs) multiply over their fields: two int fields taken together can take on (# different values of int) * (# different values of int) different values. Sum types add over their variants: a Rust enum “enum OptionalInt { Some(i32), Maybe(i32), None }” can take (# different values of i32) + (# different values of i32) + 1 different values.
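To make the counting concrete, here is a small sketch using `bool` (exactly 2 values) so the numbers stay tiny; the `Pair` and `OptionalBool` names are my own illustration, not from any library:

```rust
// Product type: a struct with two bool fields.
// bool has 2 values, so Pair has 2 * 2 = 4 distinct values.
struct Pair {
    a: bool,
    b: bool,
}

// Sum type: a tagged enum; its value count is the sum over its variants:
// 2 (Some) + 2 (Maybe) + 1 (None) = 5 distinct values.
enum OptionalBool {
    Some(bool),
    Maybe(bool),
    None,
}

fn main() {
    // Enumerate every Pair value to confirm the product count.
    let pairs: Vec<Pair> = [false, true]
        .into_iter()
        .flat_map(|a| [false, true].into_iter().map(move |b| Pair { a, b }))
        .collect();
    assert_eq!(pairs.len(), 4); // 2 * 2

    // List every OptionalBool value to confirm the sum count.
    let variants = [
        OptionalBool::Some(false),
        OptionalBool::Some(true),
        OptionalBool::Maybe(false),
        OptionalBool::Maybe(true),
        OptionalBool::None,
    ];
    assert_eq!(variants.len(), 5); // 2 + 2 + 1
    println!("product = {}, sum = {}", pairs.len(), variants.len());
}
```

The same arithmetic scales up: swap `bool` for `i32` and the counts become 2^32 * 2^32 and 2^32 + 2^32 + 1, matching the comment above.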
Most languages have structs, so that’s the “product type” sorted; in practice, “having ADTs” generally means a language has tagged enums. C doesn’t have them, but you can emulate them (quite badly) with unions and enums. Good examples of languages with ADTs are OCaml, Haskell, and Rust.
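As a sketch of what a proper tagged enum buys you over a C union+enum emulation (Rust here; the `Shape` type is my own example): the tag travels with the data, and `match` is checked for exhaustiveness, so forgetting a variant is a compile error rather than silently reading the wrong union member.

```rust
// A tagged enum ("sum type"): each value is exactly one variant.
enum Shape {
    Circle { radius: f64 },
    Rect { w: f64, h: f64 },
}

// `match` must handle every variant; deleting an arm is a compile
// error, not a latent runtime bug as with hand-rolled C tagged unions.
fn area(s: &Shape) -> f64 {
    match s {
        Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
        Shape::Rect { w, h } => w * h,
    }
}

fn main() {
    let shapes = [
        Shape::Circle { radius: 1.0 },
        Shape::Rect { w: 2.0, h: 3.0 },
    ];
    let total: f64 = shapes.iter().map(area).sum();
    println!("total area = {total:.3}");
}
```

OCaml and Haskell give the same guarantee with `match ... with` and case expressions respectively.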
An abstract data type (ADT) is a structured type whose clients cannot access the data components of variables of that type. The client understands the type in terms of its operations, provided by a set of functions that operate on variables of the type. That set of functions is called the interface of the ADT. Note that this definition requires information hiding but not inheritance or polymorphism.
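A minimal sketch of that definition (the `Counter` type is my own illustration): the representation is private to a module, so clients can only go through the interface functions.

```rust
// Abstract data type: clients see only the operations, never the data.
mod counter {
    pub struct Counter {
        value: u64, // private field: inaccessible outside this module
    }

    // The interface: the only way clients can observe or change a Counter.
    impl Counter {
        pub fn new() -> Counter {
            Counter { value: 0 }
        }
        pub fn increment(&mut self) {
            self.value += 1;
        }
        pub fn get(&self) -> u64 {
            self.value
        }
    }
}

fn main() {
    let mut c = counter::Counter::new();
    c.increment();
    c.increment();
    // `c.value` here would be a compile error: the representation is hidden.
    println!("count = {}", c.get());
}
```

Note there is no inheritance or polymorphism anywhere above; information hiding alone is enough to satisfy the definition.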
Computers and programs are really complex; programming is really hard. Computer science will eventually be able to replace human reasoning (and be better at it), but getting there requires deep mathematical knowledge and formal methods (Dijkstra was a big fan of formal program proving). Universities aren't teaching computer science because businesses don't care about that; they just want coders.
I took a few computer science courses at UT back when Dijkstra was there (not from Dijkstra himself, though; from Dr. Nell Dale). Everything in the algorithms class came with formal proofs. Loop invariants were core concepts. The book was not yet published; we used a spiral-bound photocopy of the draft.
We are incrementally replacing human reasoning with computation in the present day. For instance, most static type checkers are weak but fast theorem provers, and type inference replaces some of the human reasoning involved.
Granted, static type checking is a very minor corner case, but manifold small incremental changes add up. It's untrue that human reasoning has never been replaced with automation, and it's untrue that human reasoning isn't currently in the process of being further replaced.
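As a tiny illustration of that point (my own example): in the snippet below the compiler infers the element type of `doubled` from how it is built, and would reject the program at compile time if any step were inconsistent — reasoning a human would otherwise have to do by hand.

```rust
fn main() {
    let xs = vec![1, 2, 3];
    // No element type is written anywhere: the compiler proves to itself
    // that `xs` is Vec<i32>, the closure maps i32 to i32, and therefore
    // `doubled` is Vec<i32>.
    let doubled: Vec<_> = xs.iter().map(|x| x * 2).collect();
    assert_eq!(doubled, vec![2, 4, 6]);
    println!("{doubled:?}");
}
```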
I agree that total replacement of human reasoning isn't likely any time soon. However, I'd argue that total replacement of human reasoning implies removal of human desires from the input. (What are programs, if not incredibly formal expressions of what humans desire computers to do? How can we divorce human desires from human reasoning about what is good/desirable?) Science fiction provides numerous examples of how a complete decoupling of computers from human desires can go terribly wrong.
Banking on computers to automate all of human reasoning? Probably not. Preparing for computers to automate some disproportionately impactful subset of human reasoning, on the other hand, is very reasonable.
Both my undergrad and grad education did not train me to program. I did learn computer science though. Even then... mathematical and algorithmic proofs are in a league of their own. CS has always been applied math as much as physics is.
CS is definitely applied mathematics. Whether all of the maths that Dijkstra thought essential is of much use in the day-to-day business of programming is debatable. His curmudgeonly view of our field, from the linked paper:
'As economics is known as "The Miserable Science", software engineering should be known as "The Doomed Discipline", doomed because it cannot even approach its goal since its goal is self-contradictory'
[I've incorrectly put some of the statements from Part 2 in the Part 1 summary, but I've already sunk enough time into summarizing, and the flow feels a bit better this way.]
Part 1: definitions and motivation.
"Radical novelty" describes something new that is so different from everything that came before it that analogical thinking is misleading and new phrases built by analogy with old phrases are inadequate at best. Thinkers in the middle ages were greatly held back by over-use of analogies. There are many cases where modern society has dealt poorly with radical novelty: relativity, quantum mechanics, atomic weapons, and birth control pills. The way our civilization has learned to deal with great complexity is to specialize professions, where each profession abstracts away some amount of information, typically expressed in the range of physical scale of their work. The architect deals with a certain amount of complexity: not having to deal with the large scale of the town planner or the small scale of the metallurgist working for the I-beam manufacturer (my interpretation of "solid state physicist" in this context). Computing is radically novel in the scale of complexity handled by a single profession. The terms "software maintenance" (as if time or use alone, rather than shifting requirements, degraded software), "programmer's workbench", etc., are evidence that analogies are misleading and software is radically novel.
Part 2: consequences [this summary is more abbreviated than part 1]
History has shown the natural human reaction to radical novelty is to pretend analogies still hold and to pretend that rapid progress isn't possible. We can't keep treating software like it's just some kind of mechanical device and programmers as assembly line workers stamping out widgets. We can't treat software production as just a new type of manufacturing. Manufacturing-style quality control (testing) is a misleading analogy for software quality control, and formal methods are needed for software. Software engineering isn't about designing new widgets, but about fixing the impedance mismatch between humans as assigners of tasks and computers as executors of tasks. There are a variety of vested interests (mathematicians, businesses, the military, those teaching software development as widget building, etc.) working against advancement of Computer Science as a field. We need to start by fixing our terminology to stop encouraging misleading analogies. ("Software bug" encourages us to think of programmer errors as things that passively happen, like insects crawling into relays, etc.) The job of a software developer is to show that their designs and implementations are correct, not to correctly execute some set of operations that create a software widget.
Such a fatalistic view of human understanding helps no one.
"couldn't" is past tense. The GP clearly expects/hopes there's a quick remedy here, and your reasoning about "can't" in the present tense (and implied continuing future tense) doesn't hold.
European countries created African country borders without understanding of (or care for) the cultures and the relationships/feuds among a whole continent of peoples. They put these people together in countries and flipped all of that long history upside down.
I don’t want to diminish the impact of European colonialism, but that’s not really actionable in 2020. Bangladesh, where I’m from, was stripped of capital by the British Empire. Okay, now what? How does that fix the weak rule of law, the political patronage, etc? And it’s not like European countries don’t have a long history of sectarian and ethnic warfare.
It is actionable, the powers that be are simply averse to acting on it. I will never understand the drive to preserve nation constructs that are barely a century old in many cases at all costs.
Why do people like to argue that changing the status quo would cause suffering as though the status quo is not itself causing plenty of suffering? Especially when the chaos "caused" by change is most often actually caused by violent opposition from those desperate to preserve the current state of things?
The disruption is usually not worth it once generations have built their lives and have built their homes based on an existing border regime.
Read about the India–Pakistan partition. How would you feel if you had to leave behind everything you have and move to another country because someone decided it should be so?
Again, in many cases this is an arrangement that's only about a hundred years old (with about six or seven decades of sovereignty) _and_ is clearly not working out for the people involved. Why do people talk as though they are immutable institutions from prehistoric times?
Plenty of people built their lives on the US being a British colony. Plenty of people built their lives on Austria and Germany being a single country. Plenty of people built their lives on the USSR being a single entity. I could go on and on listing examples. Why is everyone else allowed to naturally form their own national identities but recent (particularly African) colonies are supposed to suck it up and endure the empty ones forced on them?
Perhaps. But what is your actual proposal that will actually make things better? Move the borders? Dissolve countries and go back to being tribes or something? What would that actually fix?
And if that won't fix much, then your talk about sunk cost is pointless.
It's fascinating to me that people talk about a people's desire to exercise their right to self-determination (whether autonomist or secessionist) as absurd, especially people whose forebears have already exercised that right to set up the stable societies they benefit from today.
People understand/respect separatism when it's Kosovo, Scotland, Catalunya, Hong Kong, etc, but all of a sudden want to be led by the hand when it involves African nations.
OK, but rayiner's comment was about fixing societies and economies. If you want independence for independence's sake, fine, I've got no problem with that. I merely hope that you either succeed or fail as peacefully as possible.
But the context was about economics, so I presumed that you were saying that it would be economically helpful to change the boundaries.
"Economics" is not somehow a pure topic distinct from nation building: economies thrive on the stability of nations (and vice versa). You cannot force economic prosperity out of instability and disunity, so it is very strange to ask how creating more stable/unified nations or autonomous regions is economically helpful.
This obviously does not apply to all sub-Saharan countries - there are many that already had or have managed to forge national identities of their own. I am speaking for those of us who are unevenly yoked and know it.
Fair point. Important point, too. I would merely say that independence gives you the opportunity to create stability. But you still have to make it happen, and it's not as easy as it seems.
You are comparing borders that have mostly had the chance to settle into consenting national identities over the past two thousand years to borders that were forcibly drawn in the late 19th/early 20th centuries (in some places disrupting existing nationalisation processes), as if Europe did not also go through periods of unrest and instability in the wake of dissolving empires.
That is besides the fact that (for example) there are far more extant ethnic groups in my country alone than in all of [Western, if not the entirety of] Europe - the considerations when it comes to building a nation are simply not the same. There's nothing that irks me more in these discussions than "Why haven't you already done what took us centuries to do naturally in the space of a few decades under artificial tension?"
The conflicts in Africa had already existed before the Europeans. I argue that the preexisting conflict is the fundamental cause of the lack of prosperity in Africa, not colonization. Colonialism in Africa only contributed to the problem.
Before America was colonized, there were also political divisions and multiple distinct cultures. Then, a new culture showed up and eliminated all of it. Rightfully or not, it was the establishment of unified central authority that gave America its prosperity. Go look at a map of North America and notice there are only three big countries. In Africa, the situation is essentially a stalemate because there are so many competing factions.
A group of small nations is always going to be more inefficient than one big nation. Just imagine what would happen if you abolished the federal government of the United States.
That's a simplistic view. That might be true in dynamic urban environments where people compete for intangible resources and success is predicated upon individual merit. That's not so much the case when people fight over collective ownership of land, waterways or mines.
You could say the same thing about the whites/blacks in the USA. But even there they've figured out how to make diversity into the "secret sauce" behind prosperity.
Diversity and immigration of talent keep the USA prosperous today. However, the path to prosperity was built on the backs of captured slaves, followed by two world wars that left the rest of the world damaged while the USA emerged with a massive industrial infrastructure.